Oct 09 10:27:39 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 09 10:27:39 crc restorecon[4672]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 10:27:39 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 09 10:27:40 crc restorecon[4672]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc 
restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 10:27:40 crc 
restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 
10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 10:27:40 crc 
restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 10:27:40 crc 
restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40
crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 
10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 10:27:40 crc 
restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc 
restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 10:27:40 crc restorecon[4672]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc 
restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 
crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc 
restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc 
restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc 
restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc 
restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc restorecon[4672]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 10:27:40 crc restorecon[4672]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 10:27:40 crc restorecon[4672]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 09 10:27:41 crc kubenswrapper[4740]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 09 10:27:41 crc kubenswrapper[4740]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 09 10:27:41 crc kubenswrapper[4740]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 09 10:27:41 crc kubenswrapper[4740]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 09 10:27:41 crc kubenswrapper[4740]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 09 10:27:41 crc kubenswrapper[4740]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.474573 4740 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483596 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483646 4740 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483655 4740 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483667 4740 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483675 4740 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483684 4740 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483692 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483700 4740 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483709 4740 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 09 10:27:41 crc kubenswrapper[4740]: 
W1009 10:27:41.483717 4740 feature_gate.go:330] unrecognized feature gate: Example Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483725 4740 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483733 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483741 4740 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483749 4740 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483781 4740 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483789 4740 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483797 4740 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483804 4740 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483812 4740 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483820 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483828 4740 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483835 4740 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483844 4740 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483851 4740 feature_gate.go:330] unrecognized feature gate: 
VSphereControlPlaneMachineSet Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483859 4740 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483867 4740 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483914 4740 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483927 4740 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483937 4740 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483946 4740 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483954 4740 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483965 4740 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483975 4740 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483984 4740 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.483992 4740 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484000 4740 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484008 4740 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484016 4740 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484024 4740 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484032 4740 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484039 4740 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484050 4740 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484062 4740 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484071 4740 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484079 4740 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484088 4740 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484098 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484106 4740 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484114 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484122 4740 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484130 4740 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484137 4740 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484145 4740 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484152 4740 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484160 4740 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484169 4740 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484177 4740 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484187 4740 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484197 4740 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484207 4740 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484215 4740 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484225 4740 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484233 4740 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484241 4740 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484249 4740 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484261 4740 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484272 4740 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484282 4740 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484290 4740 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484298 4740 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.484306 4740 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484467 4740 flags.go:64] FLAG: --address="0.0.0.0"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484492 4740 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484512 4740 flags.go:64] FLAG: --anonymous-auth="true"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484536 4740 flags.go:64] FLAG: --application-metrics-count-limit="100"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484549 4740 flags.go:64] FLAG: --authentication-token-webhook="false"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484559 4740 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484571 4740 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484582 4740 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484591 4740 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484601 4740 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484611 4740 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484622 4740 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484632 4740 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484641 4740 flags.go:64] FLAG: --cgroup-root=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484649 4740 flags.go:64] FLAG: --cgroups-per-qos="true"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484658 4740 flags.go:64] FLAG: --client-ca-file=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484667 4740 flags.go:64] FLAG: --cloud-config=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484679 4740 flags.go:64] FLAG: --cloud-provider=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484688 4740 flags.go:64] FLAG: --cluster-dns="[]"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484698 4740 flags.go:64] FLAG: --cluster-domain=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484707 4740 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484716 4740 flags.go:64] FLAG: --config-dir=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484726 4740 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484735 4740 flags.go:64] FLAG: --container-log-max-files="5"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484746 4740 flags.go:64] FLAG: --container-log-max-size="10Mi"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484787 4740 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484797 4740 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484806 4740 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484815 4740 flags.go:64] FLAG: --contention-profiling="false"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484825 4740 flags.go:64] FLAG: --cpu-cfs-quota="true"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484834 4740 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484844 4740 flags.go:64] FLAG: --cpu-manager-policy="none"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484853 4740 flags.go:64] FLAG: --cpu-manager-policy-options=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484864 4740 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484873 4740 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484882 4740 flags.go:64] FLAG: --enable-debugging-handlers="true"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484891 4740 flags.go:64] FLAG: --enable-load-reader="false"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484901 4740 flags.go:64] FLAG: --enable-server="true"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484909 4740 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484922 4740 flags.go:64] FLAG: --event-burst="100"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484931 4740 flags.go:64] FLAG: --event-qps="50"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484940 4740 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484950 4740 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484958 4740 flags.go:64] FLAG: --eviction-hard=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484970 4740 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484978 4740 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484987 4740 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.484997 4740 flags.go:64] FLAG: --eviction-soft=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485006 4740 flags.go:64] FLAG: --eviction-soft-grace-period=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485016 4740 flags.go:64] FLAG: --exit-on-lock-contention="false"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485025 4740 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485034 4740 flags.go:64] FLAG: --experimental-mounter-path=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485043 4740 flags.go:64] FLAG: --fail-cgroupv1="false"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485052 4740 flags.go:64] FLAG: --fail-swap-on="true"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485061 4740 flags.go:64] FLAG: --feature-gates=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485073 4740 flags.go:64] FLAG: --file-check-frequency="20s"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485083 4740 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485093 4740 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485103 4740 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485113 4740 flags.go:64] FLAG: --healthz-port="10248"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485123 4740 flags.go:64] FLAG: --help="false"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485132 4740 flags.go:64] FLAG: --hostname-override=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485141 4740 flags.go:64] FLAG: --housekeeping-interval="10s"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485150 4740 flags.go:64] FLAG: --http-check-frequency="20s"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485159 4740 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485168 4740 flags.go:64] FLAG: --image-credential-provider-config=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485177 4740 flags.go:64] FLAG: --image-gc-high-threshold="85"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485186 4740 flags.go:64] FLAG: --image-gc-low-threshold="80"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485195 4740 flags.go:64] FLAG: --image-service-endpoint=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485203 4740 flags.go:64] FLAG: --kernel-memcg-notification="false"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485212 4740 flags.go:64] FLAG: --kube-api-burst="100"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485221 4740 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485231 4740 flags.go:64] FLAG: --kube-api-qps="50"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485239 4740 flags.go:64] FLAG: --kube-reserved=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485248 4740 flags.go:64] FLAG: --kube-reserved-cgroup=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485257 4740 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485266 4740 flags.go:64] FLAG: --kubelet-cgroups=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485275 4740 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485284 4740 flags.go:64] FLAG: --lock-file=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485292 4740 flags.go:64] FLAG: --log-cadvisor-usage="false"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485301 4740 flags.go:64] FLAG: --log-flush-frequency="5s"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485312 4740 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485335 4740 flags.go:64] FLAG: --log-json-split-stream="false"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485345 4740 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485355 4740 flags.go:64] FLAG: --log-text-split-stream="false"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485363 4740 flags.go:64] FLAG: --logging-format="text"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485372 4740 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485382 4740 flags.go:64] FLAG: --make-iptables-util-chains="true"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485390 4740 flags.go:64] FLAG: --manifest-url=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485400 4740 flags.go:64] FLAG: --manifest-url-header=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485413 4740 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485422 4740 flags.go:64] FLAG: --max-open-files="1000000"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485433 4740 flags.go:64] FLAG: --max-pods="110"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485441 4740 flags.go:64] FLAG: --maximum-dead-containers="-1"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485451 4740 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485460 4740 flags.go:64] FLAG: --memory-manager-policy="None"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485468 4740 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485478 4740 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485487 4740 flags.go:64] FLAG: --node-ip="192.168.126.11"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485497 4740 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485517 4740 flags.go:64] FLAG: --node-status-max-images="50"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485526 4740 flags.go:64] FLAG: --node-status-update-frequency="10s"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485535 4740 flags.go:64] FLAG: --oom-score-adj="-999"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485544 4740 flags.go:64] FLAG: --pod-cidr=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485553 4740 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485566 4740 flags.go:64] FLAG: --pod-manifest-path=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485575 4740 flags.go:64] FLAG: --pod-max-pids="-1"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485585 4740 flags.go:64] FLAG: --pods-per-core="0"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485593 4740 flags.go:64] FLAG: --port="10250"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485603 4740 flags.go:64] FLAG: --protect-kernel-defaults="false"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485612 4740 flags.go:64] FLAG: --provider-id=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485621 4740 flags.go:64] FLAG: --qos-reserved=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485630 4740 flags.go:64] FLAG: --read-only-port="10255"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485639 4740 flags.go:64] FLAG: --register-node="true"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485649 4740 flags.go:64] FLAG: --register-schedulable="true"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485659 4740 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485675 4740 flags.go:64] FLAG: --registry-burst="10"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485684 4740 flags.go:64] FLAG: --registry-qps="5"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485693 4740 flags.go:64] FLAG: --reserved-cpus=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485703 4740 flags.go:64] FLAG: --reserved-memory=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485714 4740 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485724 4740 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485733 4740 flags.go:64] FLAG: --rotate-certificates="false"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485742 4740 flags.go:64] FLAG: --rotate-server-certificates="false"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485776 4740 flags.go:64] FLAG: --runonce="false"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485786 4740 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485796 4740 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485805 4740 flags.go:64] FLAG: --seccomp-default="false"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485814 4740 flags.go:64] FLAG: --serialize-image-pulls="true"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485822 4740 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485832 4740 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485844 4740 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485853 4740 flags.go:64] FLAG: --storage-driver-password="root"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485862 4740 flags.go:64] FLAG: --storage-driver-secure="false"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485899 4740 flags.go:64] FLAG: --storage-driver-table="stats"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485910 4740 flags.go:64] FLAG: --storage-driver-user="root"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485920 4740 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485930 4740 flags.go:64] FLAG: --sync-frequency="1m0s"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485939 4740 flags.go:64] FLAG: --system-cgroups=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485948 4740 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485962 4740 flags.go:64] FLAG: --system-reserved-cgroup=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485971 4740 flags.go:64] FLAG: --tls-cert-file=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485980 4740 flags.go:64] FLAG: --tls-cipher-suites="[]"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485990 4740 flags.go:64] FLAG: --tls-min-version=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.485999 4740 flags.go:64] FLAG: --tls-private-key-file=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.486007 4740 flags.go:64] FLAG: --topology-manager-policy="none"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.486017 4740 flags.go:64] FLAG: --topology-manager-policy-options=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.486026 4740 flags.go:64] FLAG: --topology-manager-scope="container"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.486035 4740 flags.go:64] FLAG: --v="2"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.486047 4740 flags.go:64] FLAG: --version="false"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.486059 4740 flags.go:64] FLAG: --vmodule=""
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.486070 4740 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.486079 4740 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486284 4740 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486295 4740 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486305 4740 feature_gate.go:330] unrecognized feature gate: Example
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486315 4740 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486323 4740 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486331 4740 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486340 4740 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486349 4740 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486358 4740 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486366 4740 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486375 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486385 4740 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486394 4740 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486403 4740 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486414 4740 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486422 4740 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486430 4740 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486438 4740 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486446 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486454 4740 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486462 4740 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486470 4740 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486478 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486485 4740 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486494 4740 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486504 4740 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486514 4740 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486523 4740 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486531 4740 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486543 4740 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486552 4740 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486560 4740 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486569 4740 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486576 4740 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486585 4740 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486592 4740 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486600 4740 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486608 4740 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486620 4740 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486631 4740 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486641 4740 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486651 4740 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486661 4740 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486670 4740 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486678 4740 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486686 4740 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486694 4740 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486702 4740 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486915 4740 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486924 4740 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486932 4740 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486940 4740 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486947 4740 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486955 4740 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486963 4740 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486971 4740 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486979 4740 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486986 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.486994 4740 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.487001 4740 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.487010 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.487018 4740 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.487026 4740 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.487033 4740 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.487041 4740 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.487048 4740 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.487057 4740 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.487064 4740 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.487072 4740 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.487080 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.487088 4740 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.487112 4740 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.500857 4740 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.500934 4740 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501073 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501086 4740 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501096 4740 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501105 4740 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501114 4740 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501123 4740 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501131 4740 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501140 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501152 4740 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501163 4740 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501175 4740 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501184 4740 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501193 4740 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501201 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501210 4740 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501218 4740 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501226 4740 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501234 4740 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501242 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501250 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501257 4740 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501266 4740
feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501274 4740 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501281 4740 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501290 4740 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501298 4740 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501306 4740 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501315 4740 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501324 4740 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501332 4740 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501342 4740 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501351 4740 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501360 4740 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501368 4740 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501377 4740 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501385 4740 feature_gate.go:330] unrecognized feature gate: PlatformOperators 
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501393 4740 feature_gate.go:330] unrecognized feature gate: Example Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501401 4740 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501409 4740 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501417 4740 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501424 4740 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501432 4740 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501440 4740 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501452 4740 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501462 4740 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501471 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501480 4740 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501489 4740 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501499 4740 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501509 4740 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501517 4740 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501527 4740 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501536 4740 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501545 4740 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501553 4740 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501561 4740 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501569 4740 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501579 4740 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501586 4740 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501596 4740 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501606 4740 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501618 4740 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501626 4740 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501636 4740 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501644 4740 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501652 4740 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501661 4740 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501669 4740 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501677 4740 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501684 4740 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501693 4740 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.501706 4740 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false 
ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501977 4740 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.501995 4740 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502005 4740 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502017 4740 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502026 4740 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502034 4740 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502043 4740 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502051 4740 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502059 4740 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502066 4740 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502074 4740 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502082 4740 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502092 4740 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502102 4740 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502110 4740 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502118 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502127 4740 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502134 4740 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502142 4740 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502150 4740 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502157 4740 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502166 4740 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502173 4740 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502181 4740 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502189 4740 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502196 4740 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502204 4740 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502212 4740 
feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502220 4740 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502228 4740 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502235 4740 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502243 4740 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502251 4740 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502259 4740 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502268 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502275 4740 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502283 4740 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502294 4740 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502304 4740 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502314 4740 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502322 4740 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502330 4740 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502338 4740 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502346 4740 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502354 4740 feature_gate.go:330] unrecognized feature gate: Example Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502361 4740 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502369 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502377 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502385 4740 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502392 4740 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502400 4740 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502407 4740 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502415 4740 feature_gate.go:330] unrecognized 
feature gate: ExternalOIDC Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502423 4740 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502432 4740 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502440 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502449 4740 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502456 4740 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502464 4740 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502472 4740 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502482 4740 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502491 4740 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502500 4740 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502508 4740 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502517 4740 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502525 4740 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502532 4740 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502540 4740 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502549 4740 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502558 4740 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.502567 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.502582 4740 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 09 10:27:41 crc 
kubenswrapper[4740]: I1009 10:27:41.503799 4740 server.go:940] "Client rotation is on, will bootstrap in background" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.510572 4740 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.510738 4740 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.512716 4740 server.go:997] "Starting client certificate rotation" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.512798 4740 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.514023 4740 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-09 05:42:05.152187409 +0000 UTC Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.514172 4740 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 739h14m23.638021274s for next certificate rotation Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.542551 4740 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.545943 4740 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.568194 4740 log.go:25] "Validated CRI v1 runtime API" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.615144 4740 log.go:25] "Validated CRI v1 image API" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.617849 4740 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 09 10:27:41 crc kubenswrapper[4740]: 
I1009 10:27:41.625417 4740 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-09-10-12-12-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.625480 4740 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:45 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.657021 4740 manager.go:217] Machine: {Timestamp:2025-10-09 10:27:41.65290519 +0000 UTC m=+0.615105651 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:7223a8fe-fe17-4b87-a3ce-38254af72372 BootID:e6cc4442-9b49-4c7f-99f3-2bf04675ca56 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:45 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 
DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:58:33:c8 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:58:33:c8 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:bc:4c:2a Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:5b:17:27 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:2b:a7:be Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:5a:1e:99 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:5e:a7:04:84:97:e0 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:9e:56:b9:15:cc:ee Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data 
Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.657434 4740 manager_no_libpfm.go:29] 
cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.657663 4740 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.659150 4740 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.659485 4740 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.659547 4740 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"Les
sThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.659947 4740 topology_manager.go:138] "Creating topology manager with none policy" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.659971 4740 container_manager_linux.go:303] "Creating device plugin manager" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.660798 4740 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.660862 4740 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.661179 4740 state_mem.go:36] "Initialized new in-memory state store" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.661319 4740 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.666931 4740 kubelet.go:418] "Attempting to sync node with API server" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.666970 4740 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.667011 4740 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.667033 4740 kubelet.go:324] "Adding apiserver pod source" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.667056 4740 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.671636 4740 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.672889 4740 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.674681 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Oct 09 10:27:41 crc kubenswrapper[4740]: E1009 10:27:41.674881 4740 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.674845 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Oct 09 10:27:41 crc kubenswrapper[4740]: E1009 10:27:41.674975 4740 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.676925 4740 kubelet.go:854] "Not starting 
ClusterTrustBundle informer because we are in static kubelet mode" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.678716 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.678834 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.678864 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.678879 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.678906 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.678920 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.678933 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.678954 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.678970 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.678985 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.679015 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.679029 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.680235 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.680996 4740 
server.go:1280] "Started kubelet" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.681962 4740 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 09 10:27:41 crc systemd[1]: Started Kubernetes Kubelet. Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.681965 4740 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.684164 4740 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.684625 4740 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.684685 4740 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.684893 4740 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 21:42:47.387176906 +0000 UTC Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.684967 4740 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1883h15m5.702216615s for next certificate rotation Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.685013 4740 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.685030 4740 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 09 10:27:41 crc kubenswrapper[4740]: E1009 10:27:41.685043 4740 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.685186 4740 desired_state_of_world_populator.go:146] "Desired state populator starts to run" 
Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.685227 4740 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.685524 4740 server.go:460] "Adding debug handlers to kubelet server" Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.687504 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.687599 4740 factory.go:55] Registering systemd factory Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.687627 4740 factory.go:221] Registration of the systemd container factory successfully Oct 09 10:27:41 crc kubenswrapper[4740]: E1009 10:27:41.687627 4740 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.690023 4740 factory.go:153] Registering CRI-O factory Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.690070 4740 factory.go:221] Registration of the crio container factory successfully Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.690390 4740 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.690489 4740 factory.go:103] Registering Raw factory Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 
10:27:41.690512 4740 manager.go:1196] Started watching for new ooms in manager Oct 09 10:27:41 crc kubenswrapper[4740]: E1009 10:27:41.690789 4740 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="200ms" Oct 09 10:27:41 crc kubenswrapper[4740]: E1009 10:27:41.692587 4740 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.176:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186ccbd30b2b23b1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-09 10:27:41.680935857 +0000 UTC m=+0.643136278,LastTimestamp:2025-10-09 10:27:41.680935857 +0000 UTC m=+0.643136278,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.699556 4740 manager.go:319] Starting recovery of all containers Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.714313 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.714441 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.714474 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.714518 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.714553 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.714583 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.714611 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.714638 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.714667 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.714692 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.714722 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.714782 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.714814 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.714845 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.714869 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.714891 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.714918 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.714936 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.714958 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.714979 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" 
seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.715003 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.715025 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.715044 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.715066 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.715089 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.715108 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 
10:27:41.715139 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.715169 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.715201 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.715229 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.715256 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.715283 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.715367 4740 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.715399 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.715426 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.715453 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.715477 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.715503 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.717910 4740 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.717980 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718014 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718046 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718075 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718105 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718133 4740 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718161 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718185 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718255 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718289 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718316 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718382 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718439 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718472 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718509 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718539 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718572 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718602 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718632 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718658 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718725 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718795 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718830 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718856 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718875 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718897 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718918 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718939 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718962 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.718983 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719004 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719026 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719046 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719068 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719089 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719110 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719132 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719153 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719172 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719194 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719215 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719238 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719260 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719281 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719303 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719332 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719359 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719386 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" 
seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719416 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719441 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719466 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719494 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719519 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719546 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719572 4740 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719599 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719624 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719653 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719679 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719740 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719799 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719826 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719856 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719884 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719911 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719938 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.719979 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720013 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720057 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720091 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720124 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720154 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720187 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720218 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720249 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720283 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720312 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720342 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720369 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720394 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720418 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720447 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720474 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720500 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720528 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720554 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720579 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720606 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720632 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720662 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720688 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720715 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720746 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720805 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720834 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720862 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720892 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" 
seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720921 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720951 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.720980 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721009 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721037 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721069 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721098 4740 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721124 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721168 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721200 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721232 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721261 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721288 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721320 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721346 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721373 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721400 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721427 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721459 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721488 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721514 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721543 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721571 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721601 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721642 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" 
seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721669 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721695 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721724 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721787 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721821 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721847 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721873 4740 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721899 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721928 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721948 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721967 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.721987 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722008 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722030 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722049 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722074 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722104 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722134 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722162 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722187 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722213 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722241 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722268 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722295 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722326 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722395 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722425 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722451 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722478 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722504 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722532 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722565 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722596 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722622 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722652 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722680 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722830 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722863 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722886 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722906 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722925 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722949 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.722977 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.723002 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.723023 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.723044 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.723063 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.723121 4740 reconstruct.go:97] "Volume reconstruction finished" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.723135 4740 reconciler.go:26] "Reconciler: start to sync state" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.735408 4740 manager.go:324] Recovery completed Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.750476 4740 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.752304 4740 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.752348 4740 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.752379 4740 kubelet.go:2335] "Starting kubelet main sync loop" Oct 09 10:27:41 crc kubenswrapper[4740]: E1009 10:27:41.752434 4740 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.752793 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:41 crc kubenswrapper[4740]: W1009 10:27:41.753218 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Oct 09 10:27:41 crc kubenswrapper[4740]: E1009 10:27:41.753306 4740 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.754376 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.754410 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.754421 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.755317 4740 
cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.755344 4740 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.755372 4740 state_mem.go:36] "Initialized new in-memory state store" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.777403 4740 policy_none.go:49] "None policy: Start" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.778182 4740 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.778212 4740 state_mem.go:35] "Initializing new in-memory state store" Oct 09 10:27:41 crc kubenswrapper[4740]: E1009 10:27:41.785211 4740 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 09 10:27:41 crc kubenswrapper[4740]: E1009 10:27:41.853042 4740 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.855320 4740 manager.go:334] "Starting Device Plugin manager" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.855610 4740 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.855700 4740 server.go:79] "Starting device plugin registration server" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.856196 4740 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.856218 4740 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.856938 4740 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.857119 4740 
plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.857135 4740 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 09 10:27:41 crc kubenswrapper[4740]: E1009 10:27:41.868406 4740 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 09 10:27:41 crc kubenswrapper[4740]: E1009 10:27:41.891637 4740 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="400ms" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.957318 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.958744 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.958826 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.958845 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:41 crc kubenswrapper[4740]: I1009 10:27:41.958914 4740 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 10:27:41 crc kubenswrapper[4740]: E1009 10:27:41.959453 4740 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.176:6443: connect: connection refused" node="crc" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.053414 4740 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.053939 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.055700 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.055748 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.055792 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.055970 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.056489 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.056686 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.057981 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.058026 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.058043 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.058136 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.058173 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.058191 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.058211 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.058306 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.058353 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.059673 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.059826 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.059904 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.059923 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.059869 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.060019 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.060106 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.060283 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.060380 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.061561 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.061603 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.061620 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.061713 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.061744 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.061784 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.061953 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.062213 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.062378 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.063579 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.063614 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.063629 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.063677 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.063704 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.063716 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.063984 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.064031 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.065142 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.065171 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.065182 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.128353 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.128413 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.128464 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.128504 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.128535 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.128564 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.128593 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.128632 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.128686 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.128715 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.128743 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.128817 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.128850 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.128909 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.128948 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.160286 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.162247 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.162321 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.162343 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.162386 4740 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: E1009 10:27:42.163060 4740 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.176:6443: connect: connection refused" node="crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.230845 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.230937 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.230992 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231065 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231098 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231129 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231158 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231191 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231200 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231254 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231222 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231284 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231329 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231331 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231304 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231340 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231388 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231139 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231460 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231470 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231531 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231566 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231598 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231566 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231607 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231704 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231721 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231629 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231844 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.231915 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: E1009 10:27:42.292438 4740 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="800ms"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.394816 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.406107 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.428086 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.444253 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.452904 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: W1009 10:27:42.455103 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-4e3254cb9401fa2c215a7be79b4ec442fbce01c64411731c48107b208179b0fa WatchSource:0}: Error finding container 4e3254cb9401fa2c215a7be79b4ec442fbce01c64411731c48107b208179b0fa: Status 404 returned error can't find the container with id 4e3254cb9401fa2c215a7be79b4ec442fbce01c64411731c48107b208179b0fa
Oct 09 10:27:42 crc kubenswrapper[4740]: W1009 10:27:42.463518 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-f14d805f62dbf662cc3dcb99fef235647ae37beca782d90452bcab63871e2b3a WatchSource:0}: Error finding container f14d805f62dbf662cc3dcb99fef235647ae37beca782d90452bcab63871e2b3a: Status 404 returned error can't find the container with id f14d805f62dbf662cc3dcb99fef235647ae37beca782d90452bcab63871e2b3a
Oct 09 10:27:42 crc kubenswrapper[4740]: W1009 10:27:42.479029 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-7bf4757af9640f481c1d40e7c536ee27e544164473d65a561f147b5572b20b98 WatchSource:0}: Error finding container 7bf4757af9640f481c1d40e7c536ee27e544164473d65a561f147b5572b20b98: Status 404 returned error can't find the container with id 7bf4757af9640f481c1d40e7c536ee27e544164473d65a561f147b5572b20b98
Oct 09 10:27:42 crc kubenswrapper[4740]: W1009 10:27:42.482431 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-aca19dd7484debc1d49e6ab8cccadc3b2e10af5b1cba4df29ff57d94a39ae26f WatchSource:0}: Error finding container aca19dd7484debc1d49e6ab8cccadc3b2e10af5b1cba4df29ff57d94a39ae26f: Status 404 returned error can't find the container with id aca19dd7484debc1d49e6ab8cccadc3b2e10af5b1cba4df29ff57d94a39ae26f
Oct 09 10:27:42 crc kubenswrapper[4740]: W1009 10:27:42.485018 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-5e335329bd454f6b80602a7127ff691905a3b490d42d754a31f97f9c3869ba97 WatchSource:0}: Error finding container 5e335329bd454f6b80602a7127ff691905a3b490d42d754a31f97f9c3869ba97: Status 404 returned error can't find the container with id 5e335329bd454f6b80602a7127ff691905a3b490d42d754a31f97f9c3869ba97
Oct 09 10:27:42 crc kubenswrapper[4740]: W1009 10:27:42.539158 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Oct 09 10:27:42 crc kubenswrapper[4740]: E1009 10:27:42.539705 4740 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.563312 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 10:27:42 crc kubenswrapper[4740]: W1009 10:27:42.563477 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Oct 09 10:27:42 crc kubenswrapper[4740]: E1009 10:27:42.563593 4740 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.564854 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.564892 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.564902 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.564924 4740 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: E1009 10:27:42.565436 4740 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.176:6443: connect: connection refused" node="crc"
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.685408 4740 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.758396 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"aca19dd7484debc1d49e6ab8cccadc3b2e10af5b1cba4df29ff57d94a39ae26f"}
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.760469 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7bf4757af9640f481c1d40e7c536ee27e544164473d65a561f147b5572b20b98"}
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.763229 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f14d805f62dbf662cc3dcb99fef235647ae37beca782d90452bcab63871e2b3a"}
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.764567 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4e3254cb9401fa2c215a7be79b4ec442fbce01c64411731c48107b208179b0fa"}
Oct 09 10:27:42 crc kubenswrapper[4740]: I1009 10:27:42.766022 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5e335329bd454f6b80602a7127ff691905a3b490d42d754a31f97f9c3869ba97"}
Oct 09 10:27:43 crc kubenswrapper[4740]: W1009 10:27:43.082360 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Oct 09 10:27:43 crc kubenswrapper[4740]: E1009 10:27:43.082469 4740 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError"
Oct 09 10:27:43 crc kubenswrapper[4740]: W1009 10:27:43.084654 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Oct 09 10:27:43 crc kubenswrapper[4740]: E1009 10:27:43.084789 4740 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError"
Oct 09 10:27:43 crc kubenswrapper[4740]: E1009 10:27:43.093775 4740 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="1.6s"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.366357 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.368308 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.368350 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.368364 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.368390 4740 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 09 10:27:43 crc kubenswrapper[4740]: E1009 10:27:43.368811 4740 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.176:6443: connect: connection refused" node="crc"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.685882 4740 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.769964 4740 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3a4bee512845d03143354b11bffcc4c51727812ca337a7dc072f8d2be365e15f" exitCode=0
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.770029 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3a4bee512845d03143354b11bffcc4c51727812ca337a7dc072f8d2be365e15f"}
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.770090 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.771241 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.771274 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.771285 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.771797 4740 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="78e27cfcfe86124c9582532bc3cf2decfc91f0c8335bde7bb17ecb03e1425dcd" exitCode=0
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.771894 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.771956 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"78e27cfcfe86124c9582532bc3cf2decfc91f0c8335bde7bb17ecb03e1425dcd"}
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.773091 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.773120 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.773131 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.777166 4740 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="11c020efcdc9c34156c4ce09e7186644e1e1d9a1cd49a67cc294262ccbf68ca5" exitCode=0
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.777303 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.777340 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"11c020efcdc9c34156c4ce09e7186644e1e1d9a1cd49a67cc294262ccbf68ca5"}
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.779960 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.780002 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.780049 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.781541 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74"}
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.781605 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7"}
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.781627 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1"}
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.784050 4740 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0" exitCode=0
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.784103 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0"}
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.784243 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.785388 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.785435 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.785452 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.789659 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.790398 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.790438 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 10:27:43 crc kubenswrapper[4740]: I1009 10:27:43.790452 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 10:27:44 crc kubenswrapper[4740]: W1009 10:27:44.195470 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Oct 09 10:27:44 crc kubenswrapper[4740]: E1009 10:27:44.195559 4740 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.685456 4740 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Oct 09 10:27:44 crc kubenswrapper[4740]: E1009 10:27:44.694891 4740 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="3.2s"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.789169 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"65f693c5e310d6830896a21d092ec855a8f9a5ea16c9fa82d18f9aa2e5fe6e81"}
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.789226 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"59a9cf35703b8479a9f7662d356465d6a50571a64ac5e106ec44c26e3656f815"}
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.789238 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"99eb305059d073b23f482d05ece1d61192433362fffd0bc220e2d1ddd21c8943"}
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.789249 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.790038 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.790068 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.790080 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.792669 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e"}
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.792695 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.793394 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.793427 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.793441 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.798671 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda"}
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.798712 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9"}
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.798729 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75"}
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.798742 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4"}
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.798770 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e"}
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.798788 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.799643 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.799673 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.799682 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.800632 4740 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1de0868b4d06d618bb6ab2e0506576bd89242c370ec851376ad2e705b35afd8c" exitCode=0
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.800703 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1de0868b4d06d618bb6ab2e0506576bd89242c370ec851376ad2e705b35afd8c"}
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.800818 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.802052 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.802086 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.802098 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.802371 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2c7d87ddbe1b2db67f0c17cedc17e4548dae05e62b6d1d9c2d77794c71439958"}
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.802443 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.803274 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.803288 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.803297 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.969393 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.972250 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.972292 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.972303 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 10:27:44 crc kubenswrapper[4740]: I1009 10:27:44.972328 4740 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 09 10:27:44 crc kubenswrapper[4740]: E1009 10:27:44.972652 4740 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.176:6443: connect: connection refused" node="crc"
Oct 09 10:27:45 crc kubenswrapper[4740]: E1009 10:27:45.041161 4740 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.176:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186ccbd30b2b23b1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-09 10:27:41.680935857 +0000 UTC m=+0.643136278,LastTimestamp:2025-10-09 10:27:41.680935857 +0000 UTC m=+0.643136278,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.808900 4740 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d"
containerID="e49fbad742a4580a3cf20e270f0eac1ce1eab98a7a534be19c218203beaa6ede" exitCode=0 Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.808969 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e49fbad742a4580a3cf20e270f0eac1ce1eab98a7a534be19c218203beaa6ede"} Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.809044 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.809089 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.809109 4740 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.809151 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.809176 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.809055 4740 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.809302 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.810802 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.810858 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.810876 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.811485 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.811536 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.811575 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.811648 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.811577 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.811582 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.811731 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.811791 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.811519 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.811740 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.811853 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.811937 4740 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.877503 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 10:27:45 crc kubenswrapper[4740]: I1009 10:27:45.935376 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 10:27:46 crc kubenswrapper[4740]: I1009 10:27:46.079413 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 10:27:46 crc kubenswrapper[4740]: I1009 10:27:46.223552 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 10:27:46 crc kubenswrapper[4740]: I1009 10:27:46.480547 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 10:27:46 crc kubenswrapper[4740]: I1009 10:27:46.487383 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 10:27:46 crc kubenswrapper[4740]: I1009 10:27:46.819196 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"89b9de64087d4ec3653e8c4245079d45142eea59a6acb5a476596e085965c297"} Oct 09 10:27:46 crc kubenswrapper[4740]: I1009 10:27:46.819711 4740 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 10:27:46 crc kubenswrapper[4740]: I1009 10:27:46.819813 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:46 crc kubenswrapper[4740]: I1009 10:27:46.819833 4740 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:46 crc kubenswrapper[4740]: I1009 10:27:46.819849 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:46 crc kubenswrapper[4740]: I1009 10:27:46.819285 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b45a89de4471557dbd6b81ea50c93e0958a577f9ac9891e288a582d9a77b427b"} Oct 09 10:27:46 crc kubenswrapper[4740]: I1009 10:27:46.819946 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d971640309a1b0cf862e057ee50c0cf38f72ef7b23c39f997404f4913d65c2e7"} Oct 09 10:27:46 crc kubenswrapper[4740]: I1009 10:27:46.819982 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b99422e4af192f6d2d3e6640180e4bdd92921fc1bf980ec9d3f085738a93f989"} Oct 09 10:27:46 crc kubenswrapper[4740]: I1009 10:27:46.824241 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:46 crc kubenswrapper[4740]: I1009 10:27:46.824320 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:46 crc kubenswrapper[4740]: I1009 10:27:46.824341 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:46 crc kubenswrapper[4740]: I1009 10:27:46.824400 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:46 crc kubenswrapper[4740]: I1009 10:27:46.824427 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:46 
crc kubenswrapper[4740]: I1009 10:27:46.824350 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:46 crc kubenswrapper[4740]: I1009 10:27:46.825294 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:46 crc kubenswrapper[4740]: I1009 10:27:46.825329 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:46 crc kubenswrapper[4740]: I1009 10:27:46.825343 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:47 crc kubenswrapper[4740]: I1009 10:27:47.827293 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d15e6c72950ebf93e59870c6d3d520867ec8642ec371a9c5146062200db40033"} Oct 09 10:27:47 crc kubenswrapper[4740]: I1009 10:27:47.827396 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:47 crc kubenswrapper[4740]: I1009 10:27:47.827402 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:47 crc kubenswrapper[4740]: I1009 10:27:47.828766 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:47 crc kubenswrapper[4740]: I1009 10:27:47.828802 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:47 crc kubenswrapper[4740]: I1009 10:27:47.828814 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:47 crc kubenswrapper[4740]: I1009 10:27:47.828869 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 
10:27:47 crc kubenswrapper[4740]: I1009 10:27:47.828915 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:47 crc kubenswrapper[4740]: I1009 10:27:47.828951 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:48 crc kubenswrapper[4740]: I1009 10:27:48.028836 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 09 10:27:48 crc kubenswrapper[4740]: I1009 10:27:48.173080 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:48 crc kubenswrapper[4740]: I1009 10:27:48.174694 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:48 crc kubenswrapper[4740]: I1009 10:27:48.174743 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:48 crc kubenswrapper[4740]: I1009 10:27:48.174782 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:48 crc kubenswrapper[4740]: I1009 10:27:48.174822 4740 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 10:27:48 crc kubenswrapper[4740]: I1009 10:27:48.409884 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 09 10:27:48 crc kubenswrapper[4740]: I1009 10:27:48.710123 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 10:27:48 crc kubenswrapper[4740]: I1009 10:27:48.710380 4740 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 10:27:48 crc kubenswrapper[4740]: I1009 10:27:48.710479 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:48 crc 
kubenswrapper[4740]: I1009 10:27:48.712823 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:48 crc kubenswrapper[4740]: I1009 10:27:48.712927 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:48 crc kubenswrapper[4740]: I1009 10:27:48.712944 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:48 crc kubenswrapper[4740]: I1009 10:27:48.830288 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:48 crc kubenswrapper[4740]: I1009 10:27:48.831609 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:48 crc kubenswrapper[4740]: I1009 10:27:48.831655 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:48 crc kubenswrapper[4740]: I1009 10:27:48.831670 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:49 crc kubenswrapper[4740]: I1009 10:27:49.833142 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:49 crc kubenswrapper[4740]: I1009 10:27:49.834422 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:49 crc kubenswrapper[4740]: I1009 10:27:49.834473 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:49 crc kubenswrapper[4740]: I1009 10:27:49.834485 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:50 crc kubenswrapper[4740]: I1009 10:27:50.665629 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 10:27:50 crc kubenswrapper[4740]: I1009 10:27:50.666035 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:50 crc kubenswrapper[4740]: I1009 10:27:50.670654 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:50 crc kubenswrapper[4740]: I1009 10:27:50.670706 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:50 crc kubenswrapper[4740]: I1009 10:27:50.670785 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:51 crc kubenswrapper[4740]: E1009 10:27:51.869243 4740 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 09 10:27:54 crc kubenswrapper[4740]: I1009 10:27:54.611603 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 10:27:54 crc kubenswrapper[4740]: I1009 10:27:54.611968 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:54 crc kubenswrapper[4740]: I1009 10:27:54.613807 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:54 crc kubenswrapper[4740]: I1009 10:27:54.613874 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:54 crc kubenswrapper[4740]: I1009 10:27:54.613887 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:54 crc kubenswrapper[4740]: I1009 10:27:54.619371 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 10:27:54 crc kubenswrapper[4740]: I1009 10:27:54.848502 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:54 crc kubenswrapper[4740]: I1009 10:27:54.849830 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:54 crc kubenswrapper[4740]: I1009 10:27:54.850064 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:54 crc kubenswrapper[4740]: I1009 10:27:54.850089 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:55 crc kubenswrapper[4740]: W1009 10:27:55.639457 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 09 10:27:55 crc kubenswrapper[4740]: I1009 10:27:55.639563 4740 trace.go:236] Trace[735472461]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Oct-2025 10:27:45.637) (total time: 10001ms): Oct 09 10:27:55 crc kubenswrapper[4740]: Trace[735472461]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (10:27:55.639) Oct 09 10:27:55 crc kubenswrapper[4740]: Trace[735472461]: [10.001972572s] [10.001972572s] END Oct 09 10:27:55 crc kubenswrapper[4740]: E1009 10:27:55.639588 4740 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake 
timeout" logger="UnhandledError" Oct 09 10:27:55 crc kubenswrapper[4740]: W1009 10:27:55.681497 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 09 10:27:55 crc kubenswrapper[4740]: I1009 10:27:55.681616 4740 trace.go:236] Trace[2144962840]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Oct-2025 10:27:45.679) (total time: 10001ms): Oct 09 10:27:55 crc kubenswrapper[4740]: Trace[2144962840]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (10:27:55.681) Oct 09 10:27:55 crc kubenswrapper[4740]: Trace[2144962840]: [10.00190281s] [10.00190281s] END Oct 09 10:27:55 crc kubenswrapper[4740]: E1009 10:27:55.681641 4740 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 09 10:27:55 crc kubenswrapper[4740]: I1009 10:27:55.686240 4740 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 09 10:27:55 crc kubenswrapper[4740]: I1009 10:27:55.751850 4740 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path 
\"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 09 10:27:55 crc kubenswrapper[4740]: I1009 10:27:55.751912 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 09 10:27:55 crc kubenswrapper[4740]: I1009 10:27:55.761451 4740 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 09 10:27:55 crc kubenswrapper[4740]: I1009 10:27:55.761532 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 09 10:27:55 crc kubenswrapper[4740]: I1009 10:27:55.882799 4740 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]log ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]etcd ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 09 10:27:55 crc kubenswrapper[4740]: 
[+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/generic-apiserver-start-informers ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/priority-and-fairness-filter ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/start-apiextensions-informers ok Oct 09 10:27:55 crc kubenswrapper[4740]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Oct 09 10:27:55 crc kubenswrapper[4740]: [-]poststarthook/crd-informer-synced failed: reason withheld Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/start-system-namespaces-controller ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 09 10:27:55 crc kubenswrapper[4740]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Oct 09 10:27:55 crc kubenswrapper[4740]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/bootstrap-controller ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 09 10:27:55 crc 
kubenswrapper[4740]: [+]poststarthook/start-kube-aggregator-informers ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/apiservice-registration-controller ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/apiservice-discovery-controller ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]autoregister-completion ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/apiservice-openapi-controller ok Oct 09 10:27:55 crc kubenswrapper[4740]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 09 10:27:55 crc kubenswrapper[4740]: livez check failed Oct 09 10:27:55 crc kubenswrapper[4740]: I1009 10:27:55.882862 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 10:27:57 crc kubenswrapper[4740]: I1009 10:27:57.612252 4740 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 09 10:27:57 crc kubenswrapper[4740]: I1009 10:27:57.612363 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" 
output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 09 10:27:58 crc kubenswrapper[4740]: I1009 10:27:58.059435 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 09 10:27:58 crc kubenswrapper[4740]: I1009 10:27:58.059964 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:58 crc kubenswrapper[4740]: I1009 10:27:58.061300 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:58 crc kubenswrapper[4740]: I1009 10:27:58.061346 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:58 crc kubenswrapper[4740]: I1009 10:27:58.061354 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:27:58 crc kubenswrapper[4740]: I1009 10:27:58.078512 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 09 10:27:58 crc kubenswrapper[4740]: I1009 10:27:58.857165 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:27:58 crc kubenswrapper[4740]: I1009 10:27:58.858512 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:27:58 crc kubenswrapper[4740]: I1009 10:27:58.858560 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:27:58 crc kubenswrapper[4740]: I1009 10:27:58.858576 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:00 crc kubenswrapper[4740]: I1009 10:28:00.044817 4740 reflector.go:368] Caches populated for *v1.Service from 
k8s.io/client-go/informers/factory.go:160 Oct 09 10:28:00 crc kubenswrapper[4740]: I1009 10:28:00.369319 4740 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 09 10:28:00 crc kubenswrapper[4740]: E1009 10:28:00.759905 4740 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 09 10:28:00 crc kubenswrapper[4740]: I1009 10:28:00.761055 4740 trace.go:236] Trace[1674722923]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Oct-2025 10:27:48.704) (total time: 12055ms): Oct 09 10:28:00 crc kubenswrapper[4740]: Trace[1674722923]: ---"Objects listed" error: 12055ms (10:28:00.760) Oct 09 10:28:00 crc kubenswrapper[4740]: Trace[1674722923]: [12.055999683s] [12.055999683s] END Oct 09 10:28:00 crc kubenswrapper[4740]: I1009 10:28:00.761368 4740 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 09 10:28:00 crc kubenswrapper[4740]: I1009 10:28:00.762350 4740 trace.go:236] Trace[836844453]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Oct-2025 10:27:46.098) (total time: 14663ms): Oct 09 10:28:00 crc kubenswrapper[4740]: Trace[836844453]: ---"Objects listed" error: 14663ms (10:28:00.762) Oct 09 10:28:00 crc kubenswrapper[4740]: Trace[836844453]: [14.663276497s] [14.663276497s] END Oct 09 10:28:00 crc kubenswrapper[4740]: I1009 10:28:00.762388 4740 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 09 10:28:00 crc kubenswrapper[4740]: I1009 10:28:00.762447 4740 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 09 10:28:00 crc kubenswrapper[4740]: E1009 10:28:00.764872 4740 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is 
forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 09 10:28:00 crc kubenswrapper[4740]: I1009 10:28:00.881878 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 10:28:00 crc kubenswrapper[4740]: I1009 10:28:00.885984 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.041515 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.680745 4740 apiserver.go:52] "Watching apiserver" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.685207 4740 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.685640 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc"] Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.686256 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.686332 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:01 crc kubenswrapper[4740]: E1009 10:28:01.686375 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.686556 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.686857 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.686881 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 10:28:01 crc kubenswrapper[4740]: E1009 10:28:01.687006 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.687054 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:01 crc kubenswrapper[4740]: E1009 10:28:01.687228 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.689888 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.690220 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.690228 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.691116 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.691394 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.691727 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.692006 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.692348 4740 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.692633 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.699357 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-lw8ns"] Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.699653 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lw8ns" Oct 09 10:28:01 crc kubenswrapper[4740]: W1009 10:28:01.703630 4740 reflector.go:561] object-"openshift-dns"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Oct 09 10:28:01 crc kubenswrapper[4740]: E1009 10:28:01.703685 4740 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 09 10:28:01 crc kubenswrapper[4740]: W1009 10:28:01.703734 4740 reflector.go:561] object-"openshift-dns"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Oct 09 10:28:01 crc kubenswrapper[4740]: E1009 10:28:01.703764 4740 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"kube-root-ca.crt\": Failed to watch 
*v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 09 10:28:01 crc kubenswrapper[4740]: W1009 10:28:01.703806 4740 reflector.go:561] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": failed to list *v1.Secret: secrets "node-resolver-dockercfg-kz9s7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Oct 09 10:28:01 crc kubenswrapper[4740]: E1009 10:28:01.703820 4740 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"node-resolver-dockercfg-kz9s7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-resolver-dockercfg-kz9s7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.724140 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.740134 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.760307 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:4
4Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.783091 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.784586 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-4b8lj"] Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.784885 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4b8lj" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.785873 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.785910 4740 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.787687 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.788169 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.789147 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.801798 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.818011 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.836341 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.850651 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.864074 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868138 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868179 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868198 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868217 4740 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868233 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868248 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868265 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868281 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868295 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868311 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868325 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868340 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868356 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868408 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868438 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868452 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868466 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868484 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868530 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868545 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod 
\"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868559 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868574 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868567 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868588 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868606 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868621 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868636 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868650 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868667 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868692 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868706 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868719 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868735 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868763 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 
09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868774 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868780 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868843 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868870 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868896 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868916 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868938 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868956 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868973 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.868993 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869001 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: 
"0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869009 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869027 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869043 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869059 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869074 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869090 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869105 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869120 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869137 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869178 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869179 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" 
(OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869196 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869208 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869224 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869244 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869258 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869257 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869275 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869295 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869316 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869336 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869338 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869371 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869390 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869409 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869425 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869436 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869441 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869465 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869476 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869486 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869505 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869525 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869503 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869604 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869713 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869768 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869843 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869529 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869906 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869930 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870062 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869971 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870230 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870280 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870303 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870326 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870347 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870365 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870370 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870432 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870461 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870505 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870535 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870566 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870578 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870596 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870607 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870651 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870672 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870676 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870737 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870772 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870792 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870812 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870834 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870859 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870955 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870977 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870995 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.871020 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.871043 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870826 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.870885 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.871051 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.871073 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.871172 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: E1009 10:28:01.875246 4740 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.871263 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.871271 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.871260 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.871334 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.871364 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.871391 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.871459 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.871484 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.871605 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.871627 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.871702 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.877220 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.871842 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.871878 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.871892 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.872097 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.872104 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.872950 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.873132 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.873190 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.873244 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.873338 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.873582 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.873811 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.873995 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.874280 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.874450 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.875152 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.875360 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.875519 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: E1009 10:28:01.875638 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:28:02.375617602 +0000 UTC m=+21.337817983 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.875660 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.875814 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.875885 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.876486 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.876741 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.876812 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.876995 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.877054 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.877090 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.877281 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.877282 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.877714 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.877770 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.878166 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.878258 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.878507 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.878586 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.878597 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.878652 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.878680 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.878698 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.869729 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.878706 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.878898 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.878933 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.878963 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.878983 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879003 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879027 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879057 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879084 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879119 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879147 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879180 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879226 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879254 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879282 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879304 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879329 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879351 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879373 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879401 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879425 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879461 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879487 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879524 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879557 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879576 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879594 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879616 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879641 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879663 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879680 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879696 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879715 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879730 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879764 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879782 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879800 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879815 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879832 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879849 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879881 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.879981 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880004 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880022 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880039 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880058 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880073 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880089 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880104 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880121 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880138 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880155 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880172 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880191 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880214 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880230 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880268 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880285 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880301 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880317 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880334 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880350 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880365 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880382 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880399 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880415 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880437 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880453 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880472 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880489 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880508 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880525 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.880898 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.881103 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.881452 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.882139 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.882490 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.882485 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.882706 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.882874 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.882907 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.883114 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.883176 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.883152 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.883263 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.883776 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.884097 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.884286 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.885870 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.886094 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.886602 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.887062 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.887459 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.888407 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.889005 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.889224 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.889362 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.889482 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.889497 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.889716 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.889868 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.889997 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.890122 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.885893 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.890541 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.890575 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.890764 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.890988 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.891122 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.891171 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.891287 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.891356 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.891491 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.891514 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.891602 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.891713 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.890831 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.891886 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.882533 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.892564 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.892653 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.893069 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.893217 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.894319 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.894398 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.894354 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.894522 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.894656 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.894707 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.894745 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.894836 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.894847 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.895233 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.895311 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.895387 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.895397 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.895500 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.895657 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.895680 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.895727 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.895135 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.896316 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.896495 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.896604 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.896842 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.896883 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.897009 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.897090 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.897156 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.897226 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.897294 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.897358 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.897420 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.897483 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.897548 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.897676 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.897742 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 
10:28:01.897843 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.897923 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.897987 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.898062 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.898125 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.898185 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.898246 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.898308 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.898374 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.898439 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.898528 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 
10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.898626 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.898698 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/259e1f79-cddc-4d7a-9f18-ead71047d789-serviceca\") pod \"node-ca-4b8lj\" (UID: \"259e1f79-cddc-4d7a-9f18-ead71047d789\") " pod="openshift-image-registry/node-ca-4b8lj" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.898791 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.898858 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lprx\" (UniqueName: \"kubernetes.io/projected/259e1f79-cddc-4d7a-9f18-ead71047d789-kube-api-access-2lprx\") pod \"node-ca-4b8lj\" (UID: \"259e1f79-cddc-4d7a-9f18-ead71047d789\") " pod="openshift-image-registry/node-ca-4b8lj" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.898926 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.898994 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.899068 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.899133 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.899195 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8a4a628b-ac64-4290-b415-92d89a9e7b9f-hosts-file\") pod \"node-resolver-lw8ns\" (UID: \"8a4a628b-ac64-4290-b415-92d89a9e7b9f\") " pod="openshift-dns/node-resolver-lw8ns" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.899262 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwxn5\" (UniqueName: 
\"kubernetes.io/projected/8a4a628b-ac64-4290-b415-92d89a9e7b9f-kube-api-access-pwxn5\") pod \"node-resolver-lw8ns\" (UID: \"8a4a628b-ac64-4290-b415-92d89a9e7b9f\") " pod="openshift-dns/node-resolver-lw8ns" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.899333 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.899405 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.899471 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.899539 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/259e1f79-cddc-4d7a-9f18-ead71047d789-host\") pod \"node-ca-4b8lj\" (UID: \"259e1f79-cddc-4d7a-9f18-ead71047d789\") " pod="openshift-image-registry/node-ca-4b8lj" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.899603 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.899681 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.899765 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.899979 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.900106 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.900168 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.900223 4740 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.900281 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.900334 4740 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.900385 4740 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.900437 4740 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.900488 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.900544 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.900601 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.900667 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.900728 4740 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.897151 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.897161 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.897577 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.897604 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.898091 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.898125 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.898128 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.898187 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.898217 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.898369 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.898458 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.899197 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.899223 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.900735 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.900824 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.901101 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.901169 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.901484 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: E1009 10:28:01.901536 4740 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 10:28:01 crc kubenswrapper[4740]: E1009 10:28:01.902474 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:02.40245096 +0000 UTC m=+21.364651341 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.901964 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.902118 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.902185 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.902500 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.902694 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.902948 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.902965 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.903347 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: E1009 10:28:01.903506 4740 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 10:28:01 crc kubenswrapper[4740]: E1009 10:28:01.903588 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:02.403570211 +0000 UTC m=+21.365770672 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.903639 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.903887 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.904118 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.904184 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.907290 4740 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.912432 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.912719 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.912855 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.913013 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). 
InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.913120 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.913144 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.913315 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.913543 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.913559 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.913795 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.913840 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.913857 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.913873 4740 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" 
DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.913894 4740 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.913907 4740 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.913920 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.913934 4740 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.913953 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.913966 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.913980 4740 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 
10:28:01.913993 4740 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914016 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914031 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914043 4740 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914058 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914070 4740 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914083 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914095 4740 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914112 4740 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914126 4740 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914140 4740 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914155 4740 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914173 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914186 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914198 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914212 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914229 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914243 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914256 4740 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914273 4740 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914294 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914314 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: 
\"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914327 4740 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914347 4740 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914359 4740 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914372 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914386 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914403 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914415 4740 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node 
\"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914433 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914452 4740 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914464 4740 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914477 4740 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914522 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914539 4740 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914556 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 
10:28:01.914570 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914583 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914623 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914638 4740 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914651 4740 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914664 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914680 4740 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914692 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914705 4740 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: E1009 10:28:01.914710 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914721 4740 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914734 4740 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914801 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914815 4740 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914834 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node 
\"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914847 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914859 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914869 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914890 4740 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914903 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914915 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914927 4740 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914949 4740 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914960 4740 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914972 4740 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.914988 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915000 4740 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915014 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915026 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915042 4740 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915059 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915075 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915087 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915157 4740 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915177 4740 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915189 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915206 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915219 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915231 4740 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915243 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915259 4740 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915271 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915284 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915301 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915317 4740 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915330 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915342 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915355 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915374 4740 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915386 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915401 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" 
Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915417 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915429 4740 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915444 4740 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915456 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915473 4740 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915485 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915497 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915512 4740 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915533 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915549 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915561 4740 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915574 4740 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915591 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915604 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915616 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915637 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915649 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915661 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915675 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915690 4740 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915702 4740 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915715 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915728 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915745 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915783 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915795 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915960 4740 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915985 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: E1009 10:28:01.914739 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 10:28:01 crc kubenswrapper[4740]: E1009 10:28:01.916068 4740 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:01 crc kubenswrapper[4740]: E1009 10:28:01.916195 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 10:28:01 crc kubenswrapper[4740]: E1009 10:28:01.916252 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 10:28:01 crc kubenswrapper[4740]: E1009 10:28:01.916295 4740 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.917477 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 10:28:01 crc kubenswrapper[4740]: E1009 10:28:01.917655 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-10-09 10:28:02.417611977 +0000 UTC m=+21.379812378 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:01 crc kubenswrapper[4740]: E1009 10:28:01.918049 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:02.418034668 +0000 UTC m=+21.380235049 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.918718 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.918793 4740 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.918814 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" 
(UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.918828 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.918869 4740 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.918885 4740 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.918898 4740 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.915864 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.920088 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). 
InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.920924 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.920943 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.921106 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.921063 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.921091 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.921352 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.921548 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.921797 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.922962 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.922982 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.924893 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.927083 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.927133 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.927819 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.932272 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.938575 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.944290 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.945765 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.952289 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.959852 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.962494 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc4
78274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.972466 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.983871 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:01 crc kubenswrapper[4740]: I1009 10:28:01.998749 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.005442 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.005726 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.016613 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019402 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019478 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019502 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/259e1f79-cddc-4d7a-9f18-ead71047d789-serviceca\") pod \"node-ca-4b8lj\" (UID: \"259e1f79-cddc-4d7a-9f18-ead71047d789\") " pod="openshift-image-registry/node-ca-4b8lj" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019547 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lprx\" (UniqueName: \"kubernetes.io/projected/259e1f79-cddc-4d7a-9f18-ead71047d789-kube-api-access-2lprx\") pod \"node-ca-4b8lj\" (UID: \"259e1f79-cddc-4d7a-9f18-ead71047d789\") " pod="openshift-image-registry/node-ca-4b8lj" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019569 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 
10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019581 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019621 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8a4a628b-ac64-4290-b415-92d89a9e7b9f-hosts-file\") pod \"node-resolver-lw8ns\" (UID: \"8a4a628b-ac64-4290-b415-92d89a9e7b9f\") " pod="openshift-dns/node-resolver-lw8ns" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019655 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwxn5\" (UniqueName: \"kubernetes.io/projected/8a4a628b-ac64-4290-b415-92d89a9e7b9f-kube-api-access-pwxn5\") pod \"node-resolver-lw8ns\" (UID: \"8a4a628b-ac64-4290-b415-92d89a9e7b9f\") " pod="openshift-dns/node-resolver-lw8ns" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019703 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/259e1f79-cddc-4d7a-9f18-ead71047d789-host\") pod \"node-ca-4b8lj\" (UID: \"259e1f79-cddc-4d7a-9f18-ead71047d789\") " pod="openshift-image-registry/node-ca-4b8lj" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019740 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019778 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") 
on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019790 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019803 4740 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019814 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019828 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019839 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019851 4740 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019864 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019876 4740 reconciler_common.go:293] "Volume detached for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019884 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/259e1f79-cddc-4d7a-9f18-ead71047d789-host\") pod \"node-ca-4b8lj\" (UID: \"259e1f79-cddc-4d7a-9f18-ead71047d789\") " pod="openshift-image-registry/node-ca-4b8lj" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019889 4740 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019924 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019937 4740 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019949 4740 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019960 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019972 4740 reconciler_common.go:293] "Volume detached for 
volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019983 4740 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019994 4740 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020005 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020014 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020025 4740 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020037 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020049 4740 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020059 4740 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020068 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020079 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020089 4740 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020100 4740 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.019811 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8a4a628b-ac64-4290-b415-92d89a9e7b9f-hosts-file\") pod \"node-resolver-lw8ns\" (UID: \"8a4a628b-ac64-4290-b415-92d89a9e7b9f\") " pod="openshift-dns/node-resolver-lw8ns" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020110 4740 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020147 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020199 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020218 4740 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020238 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020251 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020263 4740 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020277 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc 
kubenswrapper[4740]: I1009 10:28:02.020289 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020302 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020326 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020338 4740 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020350 4740 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020361 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020379 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020390 4740 reconciler_common.go:293] "Volume detached for volume \"images\" 
(UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020402 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020424 4740 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020435 4740 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020481 4740 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020496 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.020605 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/259e1f79-cddc-4d7a-9f18-ead71047d789-serviceca\") pod \"node-ca-4b8lj\" (UID: \"259e1f79-cddc-4d7a-9f18-ead71047d789\") " pod="openshift-image-registry/node-ca-4b8lj" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.025168 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 10:28:02 crc kubenswrapper[4740]: W1009 10:28:02.028067 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-481b0cd7d012ef78d2cb798bcad1ac4e59b219f3452b5397614c02e8ca3209b7 WatchSource:0}: Error finding container 481b0cd7d012ef78d2cb798bcad1ac4e59b219f3452b5397614c02e8ca3209b7: Status 404 returned error can't find the container with id 481b0cd7d012ef78d2cb798bcad1ac4e59b219f3452b5397614c02e8ca3209b7 Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.028767 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: W1009 10:28:02.035174 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-246c9f921ddedd4a669c0f68460fb0f3b508045b3b14e69ee1b1307bfd9bbe28 WatchSource:0}: Error finding container 246c9f921ddedd4a669c0f68460fb0f3b508045b3b14e69ee1b1307bfd9bbe28: Status 404 returned error can't find the container with id 246c9f921ddedd4a669c0f68460fb0f3b508045b3b14e69ee1b1307bfd9bbe28 Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.037588 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2lprx\" (UniqueName: \"kubernetes.io/projected/259e1f79-cddc-4d7a-9f18-ead71047d789-kube-api-access-2lprx\") pod \"node-ca-4b8lj\" (UID: \"259e1f79-cddc-4d7a-9f18-ead71047d789\") " pod="openshift-image-registry/node-ca-4b8lj" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.038473 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.044798 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.052083 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.061178 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.068728 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.082322 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:4
4Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.096663 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4b8lj" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.102173 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: W1009 10:28:02.117437 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod259e1f79_cddc_4d7a_9f18_ead71047d789.slice/crio-f5695d47f00382bbdf9150ca7bde0cfc9cf8c728a1003cf4ee11afd7398a2010 WatchSource:0}: Error finding container f5695d47f00382bbdf9150ca7bde0cfc9cf8c728a1003cf4ee11afd7398a2010: Status 404 returned error can't find the container with id f5695d47f00382bbdf9150ca7bde0cfc9cf8c728a1003cf4ee11afd7398a2010 Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.315941 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.423254 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.423364 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.423671 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.423724 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.423773 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:02 crc kubenswrapper[4740]: E1009 10:28:02.425870 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:28:03.425831237 +0000 UTC m=+22.388031618 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:28:02 crc kubenswrapper[4740]: E1009 10:28:02.426025 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 10:28:02 crc kubenswrapper[4740]: E1009 10:28:02.426048 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 10:28:02 crc kubenswrapper[4740]: E1009 10:28:02.426068 4740 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:02 crc kubenswrapper[4740]: 
E1009 10:28:02.426109 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:03.426095794 +0000 UTC m=+22.388296175 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:02 crc kubenswrapper[4740]: E1009 10:28:02.426460 4740 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 10:28:02 crc kubenswrapper[4740]: E1009 10:28:02.426543 4740 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 10:28:02 crc kubenswrapper[4740]: E1009 10:28:02.426568 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:03.426546836 +0000 UTC m=+22.388747227 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 10:28:02 crc kubenswrapper[4740]: E1009 10:28:02.426606 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:03.426585797 +0000 UTC m=+22.388786178 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 10:28:02 crc kubenswrapper[4740]: E1009 10:28:02.427575 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 10:28:02 crc kubenswrapper[4740]: E1009 10:28:02.427597 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 10:28:02 crc kubenswrapper[4740]: E1009 10:28:02.427620 4740 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:02 crc kubenswrapper[4740]: E1009 10:28:02.427662 4740 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:03.427653976 +0000 UTC m=+22.389854357 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.546078 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-kdjch"] Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.546390 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-mh8cv"] Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.546814 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.547096 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.548962 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.549502 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.549615 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.549872 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.549999 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.550128 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.550207 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.550246 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.550984 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.551255 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 
10:28:02.558208 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.568679 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.569506 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.578332 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.585035 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.593340 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.600378 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.608096 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.618618 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.620323 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-
crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contain
erID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.624685 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/223b849a-db98-4f56-a649-9e144189950a-proxy-tls\") pod \"machine-config-daemon-kdjch\" (UID: \"223b849a-db98-4f56-a649-9e144189950a\") " pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.624717 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/59656140-3a06-40cb-a5f1-ea08e22780e1-cnibin\") pod \"multus-additional-cni-plugins-mh8cv\" (UID: \"59656140-3a06-40cb-a5f1-ea08e22780e1\") " pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.624737 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/59656140-3a06-40cb-a5f1-ea08e22780e1-os-release\") pod \"multus-additional-cni-plugins-mh8cv\" (UID: \"59656140-3a06-40cb-a5f1-ea08e22780e1\") " 
pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.624772 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/59656140-3a06-40cb-a5f1-ea08e22780e1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mh8cv\" (UID: \"59656140-3a06-40cb-a5f1-ea08e22780e1\") " pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.624788 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsrz7\" (UniqueName: \"kubernetes.io/projected/223b849a-db98-4f56-a649-9e144189950a-kube-api-access-zsrz7\") pod \"machine-config-daemon-kdjch\" (UID: \"223b849a-db98-4f56-a649-9e144189950a\") " pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.624882 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/223b849a-db98-4f56-a649-9e144189950a-mcd-auth-proxy-config\") pod \"machine-config-daemon-kdjch\" (UID: \"223b849a-db98-4f56-a649-9e144189950a\") " pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.624933 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59656140-3a06-40cb-a5f1-ea08e22780e1-system-cni-dir\") pod \"multus-additional-cni-plugins-mh8cv\" (UID: \"59656140-3a06-40cb-a5f1-ea08e22780e1\") " pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.624954 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/223b849a-db98-4f56-a649-9e144189950a-rootfs\") pod \"machine-config-daemon-kdjch\" (UID: \"223b849a-db98-4f56-a649-9e144189950a\") " pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.625001 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/59656140-3a06-40cb-a5f1-ea08e22780e1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mh8cv\" (UID: \"59656140-3a06-40cb-a5f1-ea08e22780e1\") " pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.625030 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/59656140-3a06-40cb-a5f1-ea08e22780e1-cni-binary-copy\") pod \"multus-additional-cni-plugins-mh8cv\" (UID: \"59656140-3a06-40cb-a5f1-ea08e22780e1\") " pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.625053 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45vcf\" (UniqueName: \"kubernetes.io/projected/59656140-3a06-40cb-a5f1-ea08e22780e1-kube-api-access-45vcf\") pod \"multus-additional-cni-plugins-mh8cv\" (UID: \"59656140-3a06-40cb-a5f1-ea08e22780e1\") " pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.630343 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.630577 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwxn5\" (UniqueName: \"kubernetes.io/projected/8a4a628b-ac64-4290-b415-92d89a9e7b9f-kube-api-access-pwxn5\") pod \"node-resolver-lw8ns\" (UID: \"8a4a628b-ac64-4290-b415-92d89a9e7b9f\") " pod="openshift-dns/node-resolver-lw8ns" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.640036 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.654846 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.666161 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 
127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.674148 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.677632 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.683400 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-lw8ns" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.686334 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\
\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276
e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\
\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: W1009 10:28:02.694715 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a4a628b_ac64_4290_b415_92d89a9e7b9f.slice/crio-8f0ab0f961e52470df0e8ebfbb08b5351a0da8c8e857279a7d6d2d3c1843b3b8 WatchSource:0}: Error finding container 8f0ab0f961e52470df0e8ebfbb08b5351a0da8c8e857279a7d6d2d3c1843b3b8: Status 404 returned error can't find the container with id 8f0ab0f961e52470df0e8ebfbb08b5351a0da8c8e857279a7d6d2d3c1843b3b8 Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.696026 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.703502 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.711800 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.723937 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.726323 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/59656140-3a06-40cb-a5f1-ea08e22780e1-cni-binary-copy\") pod \"multus-additional-cni-plugins-mh8cv\" (UID: \"59656140-3a06-40cb-a5f1-ea08e22780e1\") " pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.726970 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45vcf\" (UniqueName: \"kubernetes.io/projected/59656140-3a06-40cb-a5f1-ea08e22780e1-kube-api-access-45vcf\") pod \"multus-additional-cni-plugins-mh8cv\" (UID: \"59656140-3a06-40cb-a5f1-ea08e22780e1\") " pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.727020 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/223b849a-db98-4f56-a649-9e144189950a-proxy-tls\") pod \"machine-config-daemon-kdjch\" (UID: 
\"223b849a-db98-4f56-a649-9e144189950a\") " pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.727030 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/59656140-3a06-40cb-a5f1-ea08e22780e1-cni-binary-copy\") pod \"multus-additional-cni-plugins-mh8cv\" (UID: \"59656140-3a06-40cb-a5f1-ea08e22780e1\") " pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.727053 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/59656140-3a06-40cb-a5f1-ea08e22780e1-cnibin\") pod \"multus-additional-cni-plugins-mh8cv\" (UID: \"59656140-3a06-40cb-a5f1-ea08e22780e1\") " pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.727077 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/59656140-3a06-40cb-a5f1-ea08e22780e1-os-release\") pod \"multus-additional-cni-plugins-mh8cv\" (UID: \"59656140-3a06-40cb-a5f1-ea08e22780e1\") " pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.727099 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/59656140-3a06-40cb-a5f1-ea08e22780e1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mh8cv\" (UID: \"59656140-3a06-40cb-a5f1-ea08e22780e1\") " pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.727127 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsrz7\" (UniqueName: \"kubernetes.io/projected/223b849a-db98-4f56-a649-9e144189950a-kube-api-access-zsrz7\") pod 
\"machine-config-daemon-kdjch\" (UID: \"223b849a-db98-4f56-a649-9e144189950a\") " pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.727150 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/59656140-3a06-40cb-a5f1-ea08e22780e1-cnibin\") pod \"multus-additional-cni-plugins-mh8cv\" (UID: \"59656140-3a06-40cb-a5f1-ea08e22780e1\") " pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.727166 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59656140-3a06-40cb-a5f1-ea08e22780e1-system-cni-dir\") pod \"multus-additional-cni-plugins-mh8cv\" (UID: \"59656140-3a06-40cb-a5f1-ea08e22780e1\") " pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.727198 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/223b849a-db98-4f56-a649-9e144189950a-rootfs\") pod \"machine-config-daemon-kdjch\" (UID: \"223b849a-db98-4f56-a649-9e144189950a\") " pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.727229 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/223b849a-db98-4f56-a649-9e144189950a-mcd-auth-proxy-config\") pod \"machine-config-daemon-kdjch\" (UID: \"223b849a-db98-4f56-a649-9e144189950a\") " pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.727282 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/59656140-3a06-40cb-a5f1-ea08e22780e1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mh8cv\" (UID: \"59656140-3a06-40cb-a5f1-ea08e22780e1\") " pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.727491 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/59656140-3a06-40cb-a5f1-ea08e22780e1-os-release\") pod \"multus-additional-cni-plugins-mh8cv\" (UID: \"59656140-3a06-40cb-a5f1-ea08e22780e1\") " pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.727709 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/223b849a-db98-4f56-a649-9e144189950a-rootfs\") pod \"machine-config-daemon-kdjch\" (UID: \"223b849a-db98-4f56-a649-9e144189950a\") " pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.727873 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/59656140-3a06-40cb-a5f1-ea08e22780e1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mh8cv\" (UID: \"59656140-3a06-40cb-a5f1-ea08e22780e1\") " pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.727233 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59656140-3a06-40cb-a5f1-ea08e22780e1-system-cni-dir\") pod \"multus-additional-cni-plugins-mh8cv\" (UID: \"59656140-3a06-40cb-a5f1-ea08e22780e1\") " pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.727981 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" 
(UniqueName: \"kubernetes.io/configmap/59656140-3a06-40cb-a5f1-ea08e22780e1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mh8cv\" (UID: \"59656140-3a06-40cb-a5f1-ea08e22780e1\") " pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.728427 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/223b849a-db98-4f56-a649-9e144189950a-mcd-auth-proxy-config\") pod \"machine-config-daemon-kdjch\" (UID: \"223b849a-db98-4f56-a649-9e144189950a\") " pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.731717 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/223b849a-db98-4f56-a649-9e144189950a-proxy-tls\") pod \"machine-config-daemon-kdjch\" (UID: \"223b849a-db98-4f56-a649-9e144189950a\") " pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.735419 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.744363 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsrz7\" (UniqueName: \"kubernetes.io/projected/223b849a-db98-4f56-a649-9e144189950a-kube-api-access-zsrz7\") pod \"machine-config-daemon-kdjch\" (UID: 
\"223b849a-db98-4f56-a649-9e144189950a\") " pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.744570 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45vcf\" (UniqueName: \"kubernetes.io/projected/59656140-3a06-40cb-a5f1-ea08e22780e1-kube-api-access-45vcf\") pod \"multus-additional-cni-plugins-mh8cv\" (UID: \"59656140-3a06-40cb-a5f1-ea08e22780e1\") " pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.746881 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.753322 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:02 crc kubenswrapper[4740]: E1009 10:28:02.753429 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.757104 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.766089 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.858319 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.863324 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.866355 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"246c9f921ddedd4a669c0f68460fb0f3b508045b3b14e69ee1b1307bfd9bbe28"} Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.868013 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a"} Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.868056 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87"} Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.868070 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"481b0cd7d012ef78d2cb798bcad1ac4e59b219f3452b5397614c02e8ca3209b7"} Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.869917 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8"} Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.869945 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a065b1016bb884705458e39d9ab3a2e8cee736923e4b60da14d5452af871bd73"} Oct 09 10:28:02 crc kubenswrapper[4740]: W1009 10:28:02.871827 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59656140_3a06_40cb_a5f1_ea08e22780e1.slice/crio-54474b3133152a1e2f47537ed80f2131e7debc88a7ae61fa8342aeb785e5da5d WatchSource:0}: Error finding container 54474b3133152a1e2f47537ed80f2131e7debc88a7ae61fa8342aeb785e5da5d: Status 404 returned error can't find the container with id 54474b3133152a1e2f47537ed80f2131e7debc88a7ae61fa8342aeb785e5da5d Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.872788 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lw8ns" event={"ID":"8a4a628b-ac64-4290-b415-92d89a9e7b9f","Type":"ContainerStarted","Data":"8f0ab0f961e52470df0e8ebfbb08b5351a0da8c8e857279a7d6d2d3c1843b3b8"} Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.875111 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4b8lj" event={"ID":"259e1f79-cddc-4d7a-9f18-ead71047d789","Type":"ContainerStarted","Data":"d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8"} Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.875177 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4b8lj" event={"ID":"259e1f79-cddc-4d7a-9f18-ead71047d789","Type":"ContainerStarted","Data":"f5695d47f00382bbdf9150ca7bde0cfc9cf8c728a1003cf4ee11afd7398a2010"} Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.879896 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:02Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.894455 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:02Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.906385 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:02Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.916314 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-qrhgt"] Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.916635 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-klnl8"] Oct 09 10:28:02 crc 
kubenswrapper[4740]: I1009 10:28:02.917114 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qrhgt" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.917663 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.920319 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.921023 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.921234 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.921385 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.921405 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:02Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.921538 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 
09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.921624 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.921679 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.922300 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.927622 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.934440 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:02Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.958707 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:02Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:02 crc kubenswrapper[4740]: I1009 10:28:02.992498 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:02Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.029887 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-host-var-lib-cni-multus\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.029950 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-etc-openvswitch\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.029985 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/192f5d73-ad53-4674-8c35-c72343c6022e-ovn-node-metrics-cert\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030015 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gsjm\" (UniqueName: \"kubernetes.io/projected/192f5d73-ad53-4674-8c35-c72343c6022e-kube-api-access-6gsjm\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030421 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-host-var-lib-kubelet\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030462 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-run-ovn\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030501 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-multus-socket-dir-parent\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030517 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-slash\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030532 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-log-socket\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030548 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-host-run-k8s-cni-cncf-io\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030562 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-hostroot\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030581 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 
10:28:03.030609 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-cnibin\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030624 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-cni-netd\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030641 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/192f5d73-ad53-4674-8c35-c72343c6022e-ovnkube-config\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030661 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-os-release\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030675 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-host-run-netns\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030699 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-multus-conf-dir\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030716 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-run-ovn-kubernetes\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030731 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-node-log\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030767 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-cni-bin\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030791 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/192f5d73-ad53-4674-8c35-c72343c6022e-env-overrides\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030818 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-run-netns\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030833 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-var-lib-openvswitch\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030851 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-cni-binary-copy\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030867 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-multus-daemon-config\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030885 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-run-systemd\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030902 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-host-var-lib-cni-bin\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030919 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-systemd-units\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030941 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/192f5d73-ad53-4674-8c35-c72343c6022e-ovnkube-script-lib\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030958 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-system-cni-dir\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030973 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-multus-cni-dir\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.030989 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-etc-kubernetes\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.031007 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-kubelet\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.031028 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-run-openvswitch\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.031054 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-host-run-multus-certs\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.031072 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvssn\" (UniqueName: \"kubernetes.io/projected/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-kube-api-access-zvssn\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.031695 4740 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.072583 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe9
4bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.111201 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.131788 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-multus-daemon-config\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc 
kubenswrapper[4740]: I1009 10:28:03.131825 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-run-systemd\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.131877 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-host-var-lib-cni-bin\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.131897 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-systemd-units\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.131913 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-system-cni-dir\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.131946 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-multus-cni-dir\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.131961 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-etc-kubernetes\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.131976 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-kubelet\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.131989 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-run-openvswitch\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132013 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-host-var-lib-cni-bin\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132022 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/192f5d73-ad53-4674-8c35-c72343c6022e-ovnkube-script-lib\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132090 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-host-run-multus-certs\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132112 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvssn\" (UniqueName: \"kubernetes.io/projected/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-kube-api-access-zvssn\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132129 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-host-var-lib-cni-multus\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132147 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-etc-openvswitch\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132162 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/192f5d73-ad53-4674-8c35-c72343c6022e-ovn-node-metrics-cert\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132175 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gsjm\" (UniqueName: 
\"kubernetes.io/projected/192f5d73-ad53-4674-8c35-c72343c6022e-kube-api-access-6gsjm\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132193 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-run-ovn\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132248 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-multus-socket-dir-parent\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132272 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-host-var-lib-kubelet\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132293 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-slash\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132314 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-log-socket\") pod \"ovnkube-node-klnl8\" 
(UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132339 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-host-run-k8s-cni-cncf-io\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132359 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-hostroot\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132409 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132448 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-cnibin\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132466 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-cni-netd\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132488 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/192f5d73-ad53-4674-8c35-c72343c6022e-ovnkube-config\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132512 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-os-release\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132529 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-host-run-netns\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132557 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-multus-conf-dir\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132577 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-run-ovn-kubernetes\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132599 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-cni-bin\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132637 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-node-log\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132657 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/192f5d73-ad53-4674-8c35-c72343c6022e-env-overrides\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132689 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-run-netns\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132710 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-var-lib-openvswitch\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132736 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-cni-binary-copy\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132825 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-multus-daemon-config\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132858 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/192f5d73-ad53-4674-8c35-c72343c6022e-ovnkube-script-lib\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.131952 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-run-systemd\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132931 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-kubelet\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.132962 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-run-openvswitch\") pod \"ovnkube-node-klnl8\" (UID: 
\"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.131988 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-systemd-units\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133023 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-system-cni-dir\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133089 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-etc-kubernetes\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133130 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-hostroot\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133147 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-multus-cni-dir\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133166 4740 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-host-run-multus-certs\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133176 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-multus-conf-dir\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133228 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-host-var-lib-cni-multus\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133273 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-etc-openvswitch\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133274 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-cni-binary-copy\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133327 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133344 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-node-log\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133349 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-multus-socket-dir-parent\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133299 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-run-ovn-kubernetes\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133384 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-run-netns\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133442 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-cnibin\") pod \"multus-qrhgt\" 
(UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133444 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-slash\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133463 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-cni-bin\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133475 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-host-run-k8s-cni-cncf-io\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133476 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-var-lib-openvswitch\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133495 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-host-var-lib-kubelet\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: 
I1009 10:28:03.133499 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-cni-netd\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133509 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-log-socket\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133628 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-os-release\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133647 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-run-ovn\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133735 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-host-run-netns\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.133899 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/192f5d73-ad53-4674-8c35-c72343c6022e-env-overrides\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.134248 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/192f5d73-ad53-4674-8c35-c72343c6022e-ovnkube-config\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.138272 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/192f5d73-ad53-4674-8c35-c72343c6022e-ovn-node-metrics-cert\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.153101 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.176947 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvssn\" (UniqueName: \"kubernetes.io/projected/73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c-kube-api-access-zvssn\") pod \"multus-qrhgt\" (UID: \"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\") " 
pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.195985 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gsjm\" (UniqueName: \"kubernetes.io/projected/192f5d73-ad53-4674-8c35-c72343c6022e-kube-api-access-6gsjm\") pod \"ovnkube-node-klnl8\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.231593 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.275799 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.284929 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qrhgt" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.292279 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:03 crc kubenswrapper[4740]: W1009 10:28:03.296872 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73e2f602_0e1d_46df_9b13_6bc0ebaf9f0c.slice/crio-beb70ef95b78a54f40bc904d3636233c3495a651242ade63a5b6a8534944c8bb WatchSource:0}: Error finding container beb70ef95b78a54f40bc904d3636233c3495a651242ade63a5b6a8534944c8bb: Status 404 returned error can't find the container with id beb70ef95b78a54f40bc904d3636233c3495a651242ade63a5b6a8534944c8bb Oct 09 10:28:03 crc kubenswrapper[4740]: W1009 10:28:03.305272 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod192f5d73_ad53_4674_8c35_c72343c6022e.slice/crio-1d5b6e39d55af80cad3fa67530110d5e10a599497266b473a9a39109ef93f006 WatchSource:0}: Error finding container 1d5b6e39d55af80cad3fa67530110d5e10a599497266b473a9a39109ef93f006: Status 404 returned error can't find the container with id 
1d5b6e39d55af80cad3fa67530110d5e10a599497266b473a9a39109ef93f006 Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.319511 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\
\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276
e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\
\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.350673 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.389100 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.431176 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.435482 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.435560 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.435585 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.435607 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.435625 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:03 crc kubenswrapper[4740]: E1009 10:28:03.435711 4740 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 10:28:03 crc kubenswrapper[4740]: E1009 10:28:03.435739 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 10:28:03 crc kubenswrapper[4740]: E1009 10:28:03.435774 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:28:05.435726224 +0000 UTC m=+24.397926605 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:28:03 crc kubenswrapper[4740]: E1009 10:28:03.435802 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-09 10:28:05.435790016 +0000 UTC m=+24.397990397 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 10:28:03 crc kubenswrapper[4740]: E1009 10:28:03.435783 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 10:28:03 crc kubenswrapper[4740]: E1009 10:28:03.435822 4740 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 10:28:03 crc kubenswrapper[4740]: E1009 10:28:03.435825 4740 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:03 crc kubenswrapper[4740]: E1009 10:28:03.435863 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:05.435848878 +0000 UTC m=+24.398049249 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 10:28:03 crc kubenswrapper[4740]: E1009 10:28:03.435869 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 10:28:03 crc kubenswrapper[4740]: E1009 10:28:03.435906 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 10:28:03 crc kubenswrapper[4740]: E1009 10:28:03.435919 4740 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:03 crc kubenswrapper[4740]: E1009 10:28:03.435877 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:05.435870728 +0000 UTC m=+24.398071109 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:03 crc kubenswrapper[4740]: E1009 10:28:03.436002 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:05.435983421 +0000 UTC m=+24.398183852 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.471884 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.513777 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.569488 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.591653 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.632034 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.670476 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.710435 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.752959 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.752998 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:03 crc kubenswrapper[4740]: E1009 10:28:03.753105 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:03 crc kubenswrapper[4740]: E1009 10:28:03.753194 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.756849 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.757702 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.758646 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.759412 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.760187 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.760848 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.761618 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.762385 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.763198 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.765457 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.766270 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.767004 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.767520 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.768036 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.768542 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.769093 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.769639 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.770152 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.770828 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.771460 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.772075 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.772775 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.773321 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.774185 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.774967 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.775693 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.849745 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.850363 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.853022 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.853839 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.854432 4740 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.854617 4740 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.867913 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.868814 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.869322 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.879027 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.881887 4740 generic.go:334] "Generic (PLEG): container finished" podID="192f5d73-ad53-4674-8c35-c72343c6022e" containerID="04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065" exitCode=0 Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.883246 4740 generic.go:334] "Generic (PLEG): container finished" podID="59656140-3a06-40cb-a5f1-ea08e22780e1" containerID="6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e" exitCode=0 Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.900797 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.912460 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.928301 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.938923 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.940593 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.950198 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.951437 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.952369 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.953094 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.953992 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.955243 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.956157 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.957368 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.958043 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.959293 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.960127 
4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.967420 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.971480 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.972553 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 09 10:28:03 crc 
kubenswrapper[4740]: I1009 10:28:03.984355 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.985160 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.986015 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 09 10:28:03 crc kubenswrapper[4740]: I1009 10:28:03.989373 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:03Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.033004 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:4
4Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.042392 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.043075 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.043610 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerStarted","Data":"659aeec0f4002ee42961282396cc37a9454e41b52aae0559cb48516221910e2c"} Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.043712 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerStarted","Data":"d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f"} Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.043790 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerStarted","Data":"fd35c229d74ebc9c761d917de9457354987fb4d9fc90e464e93ddf560473fbb9"} Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.043850 4740 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerDied","Data":"04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065"} Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.043940 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerStarted","Data":"1d5b6e39d55af80cad3fa67530110d5e10a599497266b473a9a39109ef93f006"} Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.044017 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" event={"ID":"59656140-3a06-40cb-a5f1-ea08e22780e1","Type":"ContainerDied","Data":"6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e"} Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.044123 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" event={"ID":"59656140-3a06-40cb-a5f1-ea08e22780e1","Type":"ContainerStarted","Data":"54474b3133152a1e2f47537ed80f2131e7debc88a7ae61fa8342aeb785e5da5d"} Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.044263 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qrhgt" event={"ID":"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c","Type":"ContainerStarted","Data":"2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5"} Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.044349 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qrhgt" event={"ID":"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c","Type":"ContainerStarted","Data":"beb70ef95b78a54f40bc904d3636233c3495a651242ade63a5b6a8534944c8bb"} Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.044413 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lw8ns" 
event={"ID":"8a4a628b-ac64-4290-b415-92d89a9e7b9f","Type":"ContainerStarted","Data":"122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca"} Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.083066 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.113172 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.150286 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.192678 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.239352 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.271694 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.313076 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.356051 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.397054 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.432114 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.469593 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.512604 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.548795 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.592133 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.616617 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.620402 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.637356 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.653262 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.693867 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.734264 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.752805 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:04 crc kubenswrapper[4740]: E1009 10:28:04.753030 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.776114 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.813091 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.853987 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.898383 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.903877 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerStarted","Data":"59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555"} Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 
10:28:04.903933 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerStarted","Data":"dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb"} Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.903946 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerStarted","Data":"d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25"} Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.903958 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerStarted","Data":"3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa"} Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.906832 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" event={"ID":"59656140-3a06-40cb-a5f1-ea08e22780e1","Type":"ContainerStarted","Data":"52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a"} Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.910219 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c"} Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.933394 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:04 crc kubenswrapper[4740]: E1009 10:28:04.946990 4740 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 10:28:04 crc kubenswrapper[4740]: I1009 10:28:04.996864 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:04Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.031047 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.074121 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.113661 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.157358 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.193859 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.231484 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.269131 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.310710 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.354104 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.393854 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.436381 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.453321 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.453455 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:05 crc kubenswrapper[4740]: E1009 10:28:05.453518 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:28:09.453495615 +0000 UTC m=+28.415696006 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:28:05 crc kubenswrapper[4740]: E1009 10:28:05.453538 4740 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.453630 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.453700 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:05 crc kubenswrapper[4740]: E1009 10:28:05.453733 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 
09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.453740 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:05 crc kubenswrapper[4740]: E1009 10:28:05.453774 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 10:28:05 crc kubenswrapper[4740]: E1009 10:28:05.453792 4740 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:05 crc kubenswrapper[4740]: E1009 10:28:05.453857 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:09.453823304 +0000 UTC m=+28.416023695 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:05 crc kubenswrapper[4740]: E1009 10:28:05.453883 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:09.453872345 +0000 UTC m=+28.416072736 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 10:28:05 crc kubenswrapper[4740]: E1009 10:28:05.453941 4740 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 10:28:05 crc kubenswrapper[4740]: E1009 10:28:05.453941 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 10:28:05 crc kubenswrapper[4740]: E1009 10:28:05.453965 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 10:28:05 crc kubenswrapper[4740]: E1009 10:28:05.453979 4740 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:05 crc kubenswrapper[4740]: E1009 10:28:05.453968 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:09.453960558 +0000 UTC m=+28.416160949 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 10:28:05 crc kubenswrapper[4740]: E1009 10:28:05.454052 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:09.4540355 +0000 UTC m=+28.416235951 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.471564 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.511302 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.553369 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.608109 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.638450 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.671742 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.713164 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.750671 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a
9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.753017 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.753085 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:05 crc kubenswrapper[4740]: E1009 10:28:05.753133 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:05 crc kubenswrapper[4740]: E1009 10:28:05.753224 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.795035 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"na
me\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.830798 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.870171 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.911444 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.914053 4740 generic.go:334] "Generic (PLEG): container finished" podID="59656140-3a06-40cb-a5f1-ea08e22780e1" containerID="52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a" exitCode=0 Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.914121 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" event={"ID":"59656140-3a06-40cb-a5f1-ea08e22780e1","Type":"ContainerDied","Data":"52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a"} Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.918952 4740 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerStarted","Data":"9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24"} Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.918992 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerStarted","Data":"19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9"} Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.960481 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:05 crc kubenswrapper[4740]: I1009 10:28:05.992541 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:06 crc kubenswrapper[4740]: I1009 10:28:06.033286 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T10:28:06Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:06 crc kubenswrapper[4740]: I1009 10:28:06.070531 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:06Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:06 crc kubenswrapper[4740]: I1009 10:28:06.112713 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a
9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:06Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:06 crc kubenswrapper[4740]: I1009 10:28:06.154088 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:06Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:06 crc kubenswrapper[4740]: I1009 10:28:06.190463 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:06Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:06 crc kubenswrapper[4740]: I1009 10:28:06.232775 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:06Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:06 crc kubenswrapper[4740]: I1009 10:28:06.271287 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:06Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:06 crc kubenswrapper[4740]: I1009 10:28:06.311766 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:06Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:06 crc kubenswrapper[4740]: I1009 10:28:06.354526 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:06Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:06 crc kubenswrapper[4740]: I1009 10:28:06.391453 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:06Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:06 crc kubenswrapper[4740]: I1009 10:28:06.432972 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:06Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:06 crc kubenswrapper[4740]: I1009 10:28:06.470585 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:06Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:06 crc kubenswrapper[4740]: I1009 10:28:06.753133 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:06 crc kubenswrapper[4740]: E1009 10:28:06.753276 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:06 crc kubenswrapper[4740]: I1009 10:28:06.924623 4740 generic.go:334] "Generic (PLEG): container finished" podID="59656140-3a06-40cb-a5f1-ea08e22780e1" containerID="7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb" exitCode=0 Oct 09 10:28:06 crc kubenswrapper[4740]: I1009 10:28:06.924672 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" event={"ID":"59656140-3a06-40cb-a5f1-ea08e22780e1","Type":"ContainerDied","Data":"7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb"} Oct 09 10:28:06 crc kubenswrapper[4740]: I1009 10:28:06.942578 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6
c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:06Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:06 crc kubenswrapper[4740]: I1009 10:28:06.961462 4740 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:06Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:06 crc kubenswrapper[4740]: I1009 10:28:06.977103 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:06Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:06 crc kubenswrapper[4740]: I1009 10:28:06.995995 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:06Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.006412 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T10:28:07Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.018236 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:07Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.031576 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a
9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:07Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.048812 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:07Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.075205 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:07Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.088500 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:07Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.101665 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:07Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.113099 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:07Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.125963 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:07Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.137209 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:07Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.165798 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.168381 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.168415 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.168426 
4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.168522 4740 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.174366 4740 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.174797 4740 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.176144 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.176174 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.176187 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.176204 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.176215 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:07Z","lastTransitionTime":"2025-10-09T10:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:07 crc kubenswrapper[4740]: E1009 10:28:07.192024 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:07Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.195149 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.195190 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.195202 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.195217 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.195227 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:07Z","lastTransitionTime":"2025-10-09T10:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:07 crc kubenswrapper[4740]: E1009 10:28:07.207592 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:07Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.210843 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.210903 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.210920 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.210947 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.210964 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:07Z","lastTransitionTime":"2025-10-09T10:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:07 crc kubenswrapper[4740]: E1009 10:28:07.226475 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:07Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.229839 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.229874 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.229883 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.229897 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.229909 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:07Z","lastTransitionTime":"2025-10-09T10:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:07 crc kubenswrapper[4740]: E1009 10:28:07.242345 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:07Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.246915 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.246960 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.246973 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.246992 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.247011 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:07Z","lastTransitionTime":"2025-10-09T10:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:07 crc kubenswrapper[4740]: E1009 10:28:07.265128 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:07Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:07 crc kubenswrapper[4740]: E1009 10:28:07.265260 4740 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.266900 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.266930 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.266938 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.266951 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.266960 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:07Z","lastTransitionTime":"2025-10-09T10:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.369445 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.369491 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.369502 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.369520 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.369532 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:07Z","lastTransitionTime":"2025-10-09T10:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.471950 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.472030 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.472042 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.472057 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.472070 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:07Z","lastTransitionTime":"2025-10-09T10:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.574016 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.574051 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.574064 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.574081 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.574095 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:07Z","lastTransitionTime":"2025-10-09T10:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.676176 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.676210 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.676220 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.676236 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.676247 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:07Z","lastTransitionTime":"2025-10-09T10:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.753587 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.753631 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:07 crc kubenswrapper[4740]: E1009 10:28:07.753793 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:07 crc kubenswrapper[4740]: E1009 10:28:07.753896 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.778996 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.779035 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.779047 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.779061 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.779071 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:07Z","lastTransitionTime":"2025-10-09T10:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.881121 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.881172 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.881188 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.881209 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.881226 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:07Z","lastTransitionTime":"2025-10-09T10:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.931840 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerStarted","Data":"5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a"} Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.935210 4740 generic.go:334] "Generic (PLEG): container finished" podID="59656140-3a06-40cb-a5f1-ea08e22780e1" containerID="eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800" exitCode=0 Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.935297 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" event={"ID":"59656140-3a06-40cb-a5f1-ea08e22780e1","Type":"ContainerDied","Data":"eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800"} Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.955533 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:07Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.973261 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:07Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.984028 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.984068 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.984080 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.984097 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.984110 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:07Z","lastTransitionTime":"2025-10-09T10:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:07 crc kubenswrapper[4740]: I1009 10:28:07.994659 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:07Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.011566 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:08Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.030308 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:08Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.046684 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T10:28:08Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.058520 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:08Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.070018 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a
9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:08Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.082999 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:08Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.086624 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.086662 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.086673 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.086687 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.086697 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:08Z","lastTransitionTime":"2025-10-09T10:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.101735 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:08Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.114171 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:08Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.124465 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:08Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.134605 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:08Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.146840 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:08Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.189022 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.189057 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.189070 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.189087 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.189097 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:08Z","lastTransitionTime":"2025-10-09T10:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.292321 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.292386 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.292399 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.292418 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.292430 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:08Z","lastTransitionTime":"2025-10-09T10:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.394993 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.395440 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.395561 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.395639 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.395698 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:08Z","lastTransitionTime":"2025-10-09T10:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.498829 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.499272 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.499361 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.499526 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.499613 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:08Z","lastTransitionTime":"2025-10-09T10:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.602852 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.602899 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.602913 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.602931 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.602944 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:08Z","lastTransitionTime":"2025-10-09T10:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.705654 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.705698 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.705706 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.705720 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.705730 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:08Z","lastTransitionTime":"2025-10-09T10:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.752702 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:08 crc kubenswrapper[4740]: E1009 10:28:08.753233 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.807994 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.808028 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.808040 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.808055 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.808065 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:08Z","lastTransitionTime":"2025-10-09T10:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.910513 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.910564 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.910581 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.910602 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.910622 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:08Z","lastTransitionTime":"2025-10-09T10:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.941302 4740 generic.go:334] "Generic (PLEG): container finished" podID="59656140-3a06-40cb-a5f1-ea08e22780e1" containerID="db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234" exitCode=0 Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.941351 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" event={"ID":"59656140-3a06-40cb-a5f1-ea08e22780e1","Type":"ContainerDied","Data":"db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234"} Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.965676 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:08Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.985905 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62
c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:08Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:08 crc kubenswrapper[4740]: I1009 10:28:08.998495 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T10:28:08Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.009487 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:09Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.013459 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.013517 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.013531 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.013549 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.013561 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:09Z","lastTransitionTime":"2025-10-09T10:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.024217 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:09Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.040936 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:09Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.056356 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:09Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.071115 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:09Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.088272 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:09Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.100664 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:09Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.115442 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:09Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.115937 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.116028 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.116101 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.116164 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.116218 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:09Z","lastTransitionTime":"2025-10-09T10:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.127117 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:09Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.140135 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:09Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.151105 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:09Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.220534 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.220583 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.220599 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.220620 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.220634 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:09Z","lastTransitionTime":"2025-10-09T10:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.323532 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.323572 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.323581 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.323598 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.323612 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:09Z","lastTransitionTime":"2025-10-09T10:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.426374 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.426416 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.426432 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.426461 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.426475 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:09Z","lastTransitionTime":"2025-10-09T10:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.489952 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.490040 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.490068 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.490090 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.490140 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:09 crc kubenswrapper[4740]: E1009 10:28:09.490164 4740 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 10:28:09 crc kubenswrapper[4740]: E1009 10:28:09.490230 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:28:17.490213626 +0000 UTC m=+36.452414007 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:28:09 crc kubenswrapper[4740]: E1009 10:28:09.490285 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:17.490276398 +0000 UTC m=+36.452476779 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 10:28:09 crc kubenswrapper[4740]: E1009 10:28:09.490287 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 10:28:09 crc kubenswrapper[4740]: E1009 10:28:09.490312 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 10:28:09 crc kubenswrapper[4740]: E1009 10:28:09.490330 4740 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:09 crc kubenswrapper[4740]: E1009 10:28:09.490341 4740 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 10:28:09 crc kubenswrapper[4740]: E1009 10:28:09.490386 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:17.49036609 +0000 UTC m=+36.452566511 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:09 crc kubenswrapper[4740]: E1009 10:28:09.490382 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 10:28:09 crc kubenswrapper[4740]: E1009 10:28:09.490446 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 10:28:09 crc kubenswrapper[4740]: E1009 10:28:09.490462 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:17.490435682 +0000 UTC m=+36.452636063 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 10:28:09 crc kubenswrapper[4740]: E1009 10:28:09.490473 4740 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:09 crc kubenswrapper[4740]: E1009 10:28:09.490534 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:17.490517964 +0000 UTC m=+36.452718385 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.528621 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.528692 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.528706 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.528723 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.528736 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:09Z","lastTransitionTime":"2025-10-09T10:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.631440 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.631472 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.631481 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.631495 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.631504 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:09Z","lastTransitionTime":"2025-10-09T10:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.733375 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.733410 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.733419 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.733435 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.733444 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:09Z","lastTransitionTime":"2025-10-09T10:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.752875 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.752890 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:09 crc kubenswrapper[4740]: E1009 10:28:09.753006 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:09 crc kubenswrapper[4740]: E1009 10:28:09.753112 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.836195 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.836243 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.836252 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.836266 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.836275 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:09Z","lastTransitionTime":"2025-10-09T10:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.939324 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.939366 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.939381 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.939398 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.939411 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:09Z","lastTransitionTime":"2025-10-09T10:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.961081 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerStarted","Data":"ab1241154b6b5423f56bdf4b9075c4022f1d125f700f130a02284afd1b59d2e4"} Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.961517 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.965941 4740 generic.go:334] "Generic (PLEG): container finished" podID="59656140-3a06-40cb-a5f1-ea08e22780e1" containerID="2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d" exitCode=0 Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.965995 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" event={"ID":"59656140-3a06-40cb-a5f1-ea08e22780e1","Type":"ContainerDied","Data":"2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d"} Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.976983 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:09Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:09 crc kubenswrapper[4740]: I1009 10:28:09.992233 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:09Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.001870 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.008209 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.026947 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.042214 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.042275 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.042292 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 
10:28:10.042315 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.042333 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:10Z","lastTransitionTime":"2025-10-09T10:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.045685 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.060425 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.073455 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.088663 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.105551 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.118053 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.134585 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a
9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.145782 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.145816 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.145825 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.145839 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.145848 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:10Z","lastTransitionTime":"2025-10-09T10:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.150856 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z 
is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.169914 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab1241154b6b5423f56bdf4b9075c4022f1d125f700f130a02284afd1b59d2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.187342 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.205598 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.221082 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.232982 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.246964 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.248304 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.248345 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.248360 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.248380 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.248395 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:10Z","lastTransitionTime":"2025-10-09T10:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.269888 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab1241154b6b5423f56bdf4b9075c4022f1d125f700f130a02284afd1b59d2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.343685 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.350474 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.350525 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.350537 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.350555 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.350566 4740 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:10Z","lastTransitionTime":"2025-10-09T10:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.355962 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.366713 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.378864 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d90
1bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.392239 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.402170 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.412829 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.425592 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.438689 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.452728 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.452787 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.452798 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.452812 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.452820 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:10Z","lastTransitionTime":"2025-10-09T10:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.554797 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.554844 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.554855 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.554871 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.554906 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:10Z","lastTransitionTime":"2025-10-09T10:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.657767 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.657814 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.657827 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.657844 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.657856 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:10Z","lastTransitionTime":"2025-10-09T10:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.753217 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:10 crc kubenswrapper[4740]: E1009 10:28:10.753367 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.759957 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.759985 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.759994 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.760006 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.760015 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:10Z","lastTransitionTime":"2025-10-09T10:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.862829 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.862880 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.862892 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.862908 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.862920 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:10Z","lastTransitionTime":"2025-10-09T10:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.965443 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.965481 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.965490 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.965504 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.965512 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:10Z","lastTransitionTime":"2025-10-09T10:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.973633 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" event={"ID":"59656140-3a06-40cb-a5f1-ea08e22780e1","Type":"ContainerStarted","Data":"bff7a1d6bb326aeda9c95a16b0f56a4096232e1fad83eca05c1a11038b668de8"} Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.973735 4740 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.974258 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:10 crc kubenswrapper[4740]: I1009 10:28:10.994032 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:10Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.004037 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.014135 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.028510 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.039533 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.055071 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.067965 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.068013 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.068024 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.068046 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.068060 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:11Z","lastTransitionTime":"2025-10-09T10:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.073495 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.088689 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.106033 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a
9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.124209 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.140942 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab1241154b6b5423f56bdf4b9075c4022f1d125f700f130a02284afd1b59d2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.152118 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.163436 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.170421 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.170459 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.170474 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.170493 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.170507 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:11Z","lastTransitionTime":"2025-10-09T10:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.174876 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.189318 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff7a1d6bb326aeda9c95a16b0f56a4096232e1fad83eca05c1a11038b668de8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef56
7b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.200154 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.216939 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab1241154b6b5423f56bdf4b9075c4022f1d125f700f130a02284afd1b59d2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.235382 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.249368 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.259282 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.272649 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.272690 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.272702 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.272717 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.272729 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:11Z","lastTransitionTime":"2025-10-09T10:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.273378 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.288369 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff7a1d6bb326aeda9c95a16b0f56a4096232e1fad83eca05c1a11038b668de8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef56
7b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.301876 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.315329 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.330463 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.343860 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.356823 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.373036 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.374888 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.374932 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.374946 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.374965 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.374979 4740 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:11Z","lastTransitionTime":"2025-10-09T10:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.385637 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.477331 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.477415 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.477432 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.477460 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.477478 4740 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:11Z","lastTransitionTime":"2025-10-09T10:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.579991 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.580055 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.580067 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.580085 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.580097 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:11Z","lastTransitionTime":"2025-10-09T10:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.682532 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.682568 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.682578 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.682594 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.682604 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:11Z","lastTransitionTime":"2025-10-09T10:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.752949 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:11 crc kubenswrapper[4740]: E1009 10:28:11.753064 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.753462 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:11 crc kubenswrapper[4740]: E1009 10:28:11.753528 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.765166 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.775964 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.784721 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.784790 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.784802 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.784819 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.784832 4740 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:11Z","lastTransitionTime":"2025-10-09T10:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.790332 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff7a1d6bb326aeda9c95a16b0f56a4096232e1fad83eca05c1a11038b668de8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.801179 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.814002 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.833172 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.845670 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.855181 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.874089 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.885378 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.886792 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.886827 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.886837 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.886852 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.886863 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:11Z","lastTransitionTime":"2025-10-09T10:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.896323 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.907401 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.920619 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T1
0:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.944815 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab1241154b6b5423f56bdf4b9075c4022f1d125f700f130a02284afd1b59d2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:11Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.976470 4740 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.989496 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.989530 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.989540 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.989553 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:11 crc kubenswrapper[4740]: I1009 10:28:11.989563 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:11Z","lastTransitionTime":"2025-10-09T10:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.092335 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.092371 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.092379 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.092394 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.092404 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:12Z","lastTransitionTime":"2025-10-09T10:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.194691 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.194733 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.194745 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.194785 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.194797 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:12Z","lastTransitionTime":"2025-10-09T10:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.298030 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.298067 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.298076 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.298089 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.298098 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:12Z","lastTransitionTime":"2025-10-09T10:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.400118 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.400150 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.400158 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.400173 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.400183 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:12Z","lastTransitionTime":"2025-10-09T10:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.503084 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.503126 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.503138 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.503154 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.503167 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:12Z","lastTransitionTime":"2025-10-09T10:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.606140 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.606175 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.606184 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.606197 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.606206 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:12Z","lastTransitionTime":"2025-10-09T10:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.707939 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.707992 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.708009 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.708029 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.708043 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:12Z","lastTransitionTime":"2025-10-09T10:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.752625 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:12 crc kubenswrapper[4740]: E1009 10:28:12.752813 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.810969 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.811001 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.811008 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.811022 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.811031 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:12Z","lastTransitionTime":"2025-10-09T10:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.914108 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.914167 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.914190 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.914218 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.914240 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:12Z","lastTransitionTime":"2025-10-09T10:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.981304 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-klnl8_192f5d73-ad53-4674-8c35-c72343c6022e/ovnkube-controller/0.log" Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.985339 4740 generic.go:334] "Generic (PLEG): container finished" podID="192f5d73-ad53-4674-8c35-c72343c6022e" containerID="ab1241154b6b5423f56bdf4b9075c4022f1d125f700f130a02284afd1b59d2e4" exitCode=1 Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.985449 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerDied","Data":"ab1241154b6b5423f56bdf4b9075c4022f1d125f700f130a02284afd1b59d2e4"} Oct 09 10:28:12 crc kubenswrapper[4740]: I1009 10:28:12.986223 4740 scope.go:117] "RemoveContainer" containerID="ab1241154b6b5423f56bdf4b9075c4022f1d125f700f130a02284afd1b59d2e4" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.000674 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:12Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.012632 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:13Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.015990 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.016024 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.016034 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.016048 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.016058 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:13Z","lastTransitionTime":"2025-10-09T10:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.026783 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e
2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:13Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.037809 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:13Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.049856 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:13Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.066858 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:13Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.092370 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab1241154b6b5423f56bdf4b9075c4022f1d125f700f130a02284afd1b59d2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab1241154b6b5423f56bdf4b9075c4022f1d125f700f130a02284afd1b59d2e4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:12Z\\\",\\\"message\\\":\\\"opping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 10:28:12.428301 5979 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 
10:28:12.428397 5979 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 10:28:12.428404 5979 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 10:28:12.428455 5979 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 10:28:12.428899 5979 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 10:28:12.428922 5979 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 10:28:12.428933 5979 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 10:28:12.428938 5979 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 10:28:12.428963 5979 factory.go:656] Stopping watch factory\\\\nI1009 10:28:12.428977 5979 ovnkube.go:599] Stopped ovnkube\\\\nI1009 10:28:12.428978 5979 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 10:28:12.429015 5979 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 10:28:12.429026 5979 handler.go:208] Removed *v1.Namespace event handler 
5\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce26
5fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:13Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.104363 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:4
4Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:13Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.118398 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.118428 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.118438 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.118452 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.118463 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:13Z","lastTransitionTime":"2025-10-09T10:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.119347 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:13Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.132565 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:13Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.146896 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:13Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.164739 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff7a1d6bb326aeda9c95a16b0f56a4096232e1fad83eca05c1a11038b668de8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef56
7b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:13Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.180447 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:13Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.193571 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:13Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.220019 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.220049 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.220057 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 
10:28:13.220069 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.220078 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:13Z","lastTransitionTime":"2025-10-09T10:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.322226 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.322252 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.322261 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.322274 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.322282 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:13Z","lastTransitionTime":"2025-10-09T10:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.424462 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.424510 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.424522 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.424538 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.424551 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:13Z","lastTransitionTime":"2025-10-09T10:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.527137 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.527182 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.527196 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.527215 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.527227 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:13Z","lastTransitionTime":"2025-10-09T10:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.629743 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.629815 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.629845 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.629870 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.629886 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:13Z","lastTransitionTime":"2025-10-09T10:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.732278 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.732315 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.732327 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.732342 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.732355 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:13Z","lastTransitionTime":"2025-10-09T10:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.752702 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.752827 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:13 crc kubenswrapper[4740]: E1009 10:28:13.752878 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:13 crc kubenswrapper[4740]: E1009 10:28:13.753060 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.834995 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.835044 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.835055 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.835071 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.835082 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:13Z","lastTransitionTime":"2025-10-09T10:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.938263 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.938325 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.938342 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.938365 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.938383 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:13Z","lastTransitionTime":"2025-10-09T10:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.993031 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-klnl8_192f5d73-ad53-4674-8c35-c72343c6022e/ovnkube-controller/1.log" Oct 09 10:28:13 crc kubenswrapper[4740]: I1009 10:28:13.994031 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-klnl8_192f5d73-ad53-4674-8c35-c72343c6022e/ovnkube-controller/0.log" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.000642 4740 generic.go:334] "Generic (PLEG): container finished" podID="192f5d73-ad53-4674-8c35-c72343c6022e" containerID="fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280" exitCode=1 Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.000707 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerDied","Data":"fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280"} Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.000817 4740 scope.go:117] "RemoveContainer" containerID="ab1241154b6b5423f56bdf4b9075c4022f1d125f700f130a02284afd1b59d2e4" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.002116 4740 scope.go:117] "RemoveContainer" containerID="fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280" Oct 09 10:28:14 crc kubenswrapper[4740]: E1009 10:28:14.002390 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.024654 4740 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.039409 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.042067 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.042118 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.042126 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.042160 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.042171 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:14Z","lastTransitionTime":"2025-10-09T10:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.059614 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.079675 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.097554 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.118958 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.137949 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.144689 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.144797 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.144857 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.144887 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.144914 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:14Z","lastTransitionTime":"2025-10-09T10:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.154645 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.172597 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.188451 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T1
0:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.212613 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab1241154b6b5423f56bdf4b9075c4022f1d125f700f130a02284afd1b59d2e4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:12Z\\\",\\\"message\\\":\\\"opping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 10:28:12.428301 5979 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 10:28:12.428397 5979 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1009 10:28:12.428404 5979 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 10:28:12.428455 5979 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 10:28:12.428899 5979 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 10:28:12.428922 5979 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 10:28:12.428933 5979 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 10:28:12.428938 5979 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 10:28:12.428963 5979 factory.go:656] Stopping watch factory\\\\nI1009 10:28:12.428977 5979 ovnkube.go:599] Stopped ovnkube\\\\nI1009 10:28:12.428978 5979 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 10:28:12.429015 5979 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 10:28:12.429026 5979 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:13Z\\\",\\\"message\\\":\\\"ns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 10:28:13.797610 6124 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster 
options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1009 10:28:13.797559 6124 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\
\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 
09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.227126 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.240032 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.246850 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.246884 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.246895 4740 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.246909 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.246920 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:14Z","lastTransitionTime":"2025-10-09T10:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.255741 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff7a1d6bb326aeda9c9
5a16b0f56a4096232e1fad83eca05c1a11038b668de8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.349604 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.349675 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.349709 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.349746 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 
10:28:14.349819 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:14Z","lastTransitionTime":"2025-10-09T10:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.451987 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.452047 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.452064 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.452087 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.452105 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:14Z","lastTransitionTime":"2025-10-09T10:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.555236 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.555347 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.555370 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.555400 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.555422 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:14Z","lastTransitionTime":"2025-10-09T10:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.659123 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.659176 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.659188 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.659206 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.659219 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:14Z","lastTransitionTime":"2025-10-09T10:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.741271 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8"] Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.741885 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.745141 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.746853 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.753581 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:14 crc kubenswrapper[4740]: E1009 10:28:14.753790 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.758940 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.761184 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.761234 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.761248 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.761268 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.761284 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:14Z","lastTransitionTime":"2025-10-09T10:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.775458 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.792213 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff7a1d6bb326aeda9c95a16b0f56a4096232e1fad83eca05c1a11038b668de8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef56
7b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.804945 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.818004 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.830457 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47870d7b-1faf-4429-81f5-3d0c8b489843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjrz8\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.848343 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.862378 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.863126 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.863199 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.863230 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.863246 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.863256 4740 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:14Z","lastTransitionTime":"2025-10-09T10:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.874373 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.884861 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/47870d7b-1faf-4429-81f5-3d0c8b489843-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fjrz8\" (UID: \"47870d7b-1faf-4429-81f5-3d0c8b489843\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.884962 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w6fq\" (UniqueName: \"kubernetes.io/projected/47870d7b-1faf-4429-81f5-3d0c8b489843-kube-api-access-4w6fq\") pod \"ovnkube-control-plane-749d76644c-fjrz8\" (UID: \"47870d7b-1faf-4429-81f5-3d0c8b489843\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.885027 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/47870d7b-1faf-4429-81f5-3d0c8b489843-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fjrz8\" (UID: \"47870d7b-1faf-4429-81f5-3d0c8b489843\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.885124 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/47870d7b-1faf-4429-81f5-3d0c8b489843-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fjrz8\" (UID: \"47870d7b-1faf-4429-81f5-3d0c8b489843\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.894304 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab1241154b6b5423f56bdf4b9075c4022f1d125f700f130a02284afd1b59d2e4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:12Z\\\",\\\"message\\\":\\\"opping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 
10:28:12.428301 5979 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 10:28:12.428397 5979 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 10:28:12.428404 5979 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 10:28:12.428455 5979 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 10:28:12.428899 5979 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 10:28:12.428922 5979 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 10:28:12.428933 5979 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 10:28:12.428938 5979 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 10:28:12.428963 5979 factory.go:656] Stopping watch factory\\\\nI1009 10:28:12.428977 5979 ovnkube.go:599] Stopped ovnkube\\\\nI1009 10:28:12.428978 5979 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 10:28:12.429015 5979 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 10:28:12.429026 5979 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:13Z\\\",\\\"message\\\":\\\"ns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 10:28:13.797610 6124 transact.go:42] Configuring OVN: 
[{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1009 10:28:13.797559 6124 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53
173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.915731 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:4
4Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.925113 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-s
cript\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.935578 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.947382 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d90
1bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.959629 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:14Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.966163 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:14 crc 
kubenswrapper[4740]: I1009 10:28:14.966195 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.966203 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.966216 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.966224 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:14Z","lastTransitionTime":"2025-10-09T10:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.985783 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/47870d7b-1faf-4429-81f5-3d0c8b489843-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fjrz8\" (UID: \"47870d7b-1faf-4429-81f5-3d0c8b489843\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.985846 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/47870d7b-1faf-4429-81f5-3d0c8b489843-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fjrz8\" (UID: \"47870d7b-1faf-4429-81f5-3d0c8b489843\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.985875 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4w6fq\" (UniqueName: \"kubernetes.io/projected/47870d7b-1faf-4429-81f5-3d0c8b489843-kube-api-access-4w6fq\") pod \"ovnkube-control-plane-749d76644c-fjrz8\" (UID: \"47870d7b-1faf-4429-81f5-3d0c8b489843\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.985894 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/47870d7b-1faf-4429-81f5-3d0c8b489843-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fjrz8\" (UID: \"47870d7b-1faf-4429-81f5-3d0c8b489843\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.986583 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/47870d7b-1faf-4429-81f5-3d0c8b489843-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fjrz8\" (UID: \"47870d7b-1faf-4429-81f5-3d0c8b489843\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.986816 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/47870d7b-1faf-4429-81f5-3d0c8b489843-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fjrz8\" (UID: \"47870d7b-1faf-4429-81f5-3d0c8b489843\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" Oct 09 10:28:14 crc kubenswrapper[4740]: I1009 10:28:14.990606 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/47870d7b-1faf-4429-81f5-3d0c8b489843-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fjrz8\" (UID: \"47870d7b-1faf-4429-81f5-3d0c8b489843\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.001846 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w6fq\" (UniqueName: \"kubernetes.io/projected/47870d7b-1faf-4429-81f5-3d0c8b489843-kube-api-access-4w6fq\") pod \"ovnkube-control-plane-749d76644c-fjrz8\" (UID: \"47870d7b-1faf-4429-81f5-3d0c8b489843\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.004542 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-klnl8_192f5d73-ad53-4674-8c35-c72343c6022e/ovnkube-controller/1.log" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.007632 4740 scope.go:117] "RemoveContainer" containerID="fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280" Oct 09 10:28:15 crc kubenswrapper[4740]: E1009 10:28:15.007894 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.021263 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.033175 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.046452 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.058784 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.061049 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.068447 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.068499 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.068511 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.068531 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.068542 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:15Z","lastTransitionTime":"2025-10-09T10:28:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.075526 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.092725 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.115052 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:13Z\\\",\\\"message\\\":\\\"ns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 10:28:13.797610 6124 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1009 10:28:13.797559 6124 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b
2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.130305 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:4
4Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.142865 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-s
cript\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.156349 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.169770 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.171921 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.171962 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.171972 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.171987 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.171999 4740 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:15Z","lastTransitionTime":"2025-10-09T10:28:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.183303 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff7a1d6bb326aeda9c95a16b0f56a4096232e1fad83eca05c1a11038b668de8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.194071 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47870d7b-1faf-4429-81f5-3d0c8b489843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjrz8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.206415 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.217635 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.274609 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.274649 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.274668 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 
10:28:15.274682 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.274985 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:15Z","lastTransitionTime":"2025-10-09T10:28:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.377092 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.377122 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.377134 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.377148 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.377159 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:15Z","lastTransitionTime":"2025-10-09T10:28:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.480304 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.480336 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.480347 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.480361 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.480375 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:15Z","lastTransitionTime":"2025-10-09T10:28:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.583496 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.583541 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.583549 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.583568 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.583578 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:15Z","lastTransitionTime":"2025-10-09T10:28:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.686619 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.686688 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.686712 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.686743 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.686804 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:15Z","lastTransitionTime":"2025-10-09T10:28:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.753122 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.753293 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:15 crc kubenswrapper[4740]: E1009 10:28:15.754167 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:15 crc kubenswrapper[4740]: E1009 10:28:15.754557 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.790221 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.790272 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.790288 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.790311 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.790328 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:15Z","lastTransitionTime":"2025-10-09T10:28:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.850874 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-z74b9"] Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.851551 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:15 crc kubenswrapper[4740]: E1009 10:28:15.851631 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.871788 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.891379 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.892982 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.893132 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.893249 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.893387 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.893509 4740 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:15Z","lastTransitionTime":"2025-10-09T10:28:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.914426 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff7a1d6bb326aeda9c95a16b0f56a4096232e1fad83eca05c1a11038b668de8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.930054 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.947049 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47870d7b-1faf-4429-81f5-3d0c8b489843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjrz8\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.959867 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z74b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01aecf36-9a78-414c-8078-5c114c1dfa3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z74b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc 
kubenswrapper[4740]: I1009 10:28:15.977967 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.994198 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flwht\" (UniqueName: \"kubernetes.io/projected/01aecf36-9a78-414c-8078-5c114c1dfa3f-kube-api-access-flwht\") pod \"network-metrics-daemon-z74b9\" (UID: \"01aecf36-9a78-414c-8078-5c114c1dfa3f\") " pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.994268 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs\") pod \"network-metrics-daemon-z74b9\" (UID: \"01aecf36-9a78-414c-8078-5c114c1dfa3f\") " pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.996407 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.996476 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.996499 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.996528 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.996551 4740 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:15Z","lastTransitionTime":"2025-10-09T10:28:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:15 crc kubenswrapper[4740]: I1009 10:28:15.998433 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:15Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.012385 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" 
event={"ID":"47870d7b-1faf-4429-81f5-3d0c8b489843","Type":"ContainerStarted","Data":"84531844a5e9861188f762135b344d8f89410bcc2acbf0ec8ac93d188b88bbac"} Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.012459 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" event={"ID":"47870d7b-1faf-4429-81f5-3d0c8b489843","Type":"ContainerStarted","Data":"fc875051bc1dfc50841bc7e55c02b0d92fe31059e541830612ce459eb1247d97"} Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.012479 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" event={"ID":"47870d7b-1faf-4429-81f5-3d0c8b489843","Type":"ContainerStarted","Data":"c9279f7ade4499f28b048af59954c9466dd9359839ac92d0d375436d492c23e6"} Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.013991 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.032623 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.047992 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.063322 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.082027 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a
9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.095674 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs\") pod \"network-metrics-daemon-z74b9\" (UID: \"01aecf36-9a78-414c-8078-5c114c1dfa3f\") " pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.095911 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flwht\" (UniqueName: \"kubernetes.io/projected/01aecf36-9a78-414c-8078-5c114c1dfa3f-kube-api-access-flwht\") pod \"network-metrics-daemon-z74b9\" (UID: \"01aecf36-9a78-414c-8078-5c114c1dfa3f\") " pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:16 crc kubenswrapper[4740]: E1009 10:28:16.095939 4740 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 10:28:16 crc kubenswrapper[4740]: E1009 10:28:16.096041 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs podName:01aecf36-9a78-414c-8078-5c114c1dfa3f nodeName:}" failed. No retries permitted until 2025-10-09 10:28:16.596010854 +0000 UTC m=+35.558211275 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs") pod "network-metrics-daemon-z74b9" (UID: "01aecf36-9a78-414c-8078-5c114c1dfa3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.100111 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.100170 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.100185 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.100206 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.100221 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:16Z","lastTransitionTime":"2025-10-09T10:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.103738 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z 
is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.126378 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:13Z\\\",\\\"message\\\":\\\"ns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 10:28:13.797610 6124 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1009 10:28:13.797559 6124 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b
2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.127064 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flwht\" (UniqueName: \"kubernetes.io/projected/01aecf36-9a78-414c-8078-5c114c1dfa3f-kube-api-access-flwht\") pod \"network-metrics-daemon-z74b9\" (UID: \"01aecf36-9a78-414c-8078-5c114c1dfa3f\") " pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.157132 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:4
4Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.195644 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-
kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.201945 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.201989 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.202006 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.202027 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.202042 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:16Z","lastTransitionTime":"2025-10-09T10:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.212000 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.223377 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47870d7b-1faf-4429-81f5-3d0c8b489843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc875051bc1dfc50841bc7e55c02b0d92fe31059e541830612ce459eb1247d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84531844a5e9861188f762135b344d8f89410
bcc2acbf0ec8ac93d188b88bbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjrz8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.235366 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z74b9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01aecf36-9a78-414c-8078-5c114c1dfa3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z74b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc 
kubenswrapper[4740]: I1009 10:28:16.245638 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.256837 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.266281 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.285777 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:13Z\\\",\\\"message\\\":\\\"ns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 10:28:13.797610 6124 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1009 10:28:13.797559 6124 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b
2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.298788 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:4
4Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.304610 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.304647 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.304656 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.304669 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.304678 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:16Z","lastTransitionTime":"2025-10-09T10:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.311031 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.321079 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.331639 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d90
1bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.344026 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.355221 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.368058 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.383332 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff7a1d6bb326aeda9c95a16b0f56a4096232e1fad83eca05c1a11038b668de8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef56
7b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:16Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.407248 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.407293 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.407306 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.407354 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.407367 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:16Z","lastTransitionTime":"2025-10-09T10:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.509999 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.510065 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.510089 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.510120 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.510144 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:16Z","lastTransitionTime":"2025-10-09T10:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.600741 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs\") pod \"network-metrics-daemon-z74b9\" (UID: \"01aecf36-9a78-414c-8078-5c114c1dfa3f\") " pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:16 crc kubenswrapper[4740]: E1009 10:28:16.600959 4740 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 10:28:16 crc kubenswrapper[4740]: E1009 10:28:16.601041 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs podName:01aecf36-9a78-414c-8078-5c114c1dfa3f nodeName:}" failed. No retries permitted until 2025-10-09 10:28:17.601018539 +0000 UTC m=+36.563218950 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs") pod "network-metrics-daemon-z74b9" (UID: "01aecf36-9a78-414c-8078-5c114c1dfa3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.613603 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.613671 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.613730 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.613807 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.613834 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:16Z","lastTransitionTime":"2025-10-09T10:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.717097 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.717149 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.717164 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.717186 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.717203 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:16Z","lastTransitionTime":"2025-10-09T10:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.753342 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:16 crc kubenswrapper[4740]: E1009 10:28:16.753527 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.820247 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.820285 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.820296 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.820310 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.820320 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:16Z","lastTransitionTime":"2025-10-09T10:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.923006 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.923056 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.923067 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.923086 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:16 crc kubenswrapper[4740]: I1009 10:28:16.923098 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:16Z","lastTransitionTime":"2025-10-09T10:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.025391 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.025447 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.025462 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.025485 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.025501 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:17Z","lastTransitionTime":"2025-10-09T10:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.129225 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.129269 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.129278 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.129296 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.129305 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:17Z","lastTransitionTime":"2025-10-09T10:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.232144 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.232209 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.232232 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.232256 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.232274 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:17Z","lastTransitionTime":"2025-10-09T10:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.335101 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.335478 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.335873 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.336242 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.336571 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:17Z","lastTransitionTime":"2025-10-09T10:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.440061 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.440095 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.440106 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.440123 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.440134 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:17Z","lastTransitionTime":"2025-10-09T10:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.509111 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.509255 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.509300 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.509354 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.509380 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.509520 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.509538 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.509552 4740 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.509609 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:33.509591932 +0000 UTC m=+52.471792323 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.510027 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:28:33.510015143 +0000 UTC m=+52.472215544 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.510064 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.510115 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.510109 4740 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 10:28:17 crc 
kubenswrapper[4740]: E1009 10:28:17.510184 4740 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.510232 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:33.510201288 +0000 UTC m=+52.472401699 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.510134 4740 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.510265 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:33.51024446 +0000 UTC m=+52.472444881 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.510372 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 10:28:33.510338112 +0000 UTC m=+52.472538493 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.542695 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.542732 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.542742 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.542769 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.542780 4740 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:17Z","lastTransitionTime":"2025-10-09T10:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.562955 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.562996 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.563006 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.563026 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.563042 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:17Z","lastTransitionTime":"2025-10-09T10:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.585170 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:17Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.591272 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.591313 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.591324 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.591341 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.591354 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:17Z","lastTransitionTime":"2025-10-09T10:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.604908 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:17Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.608563 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.608618 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.608628 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.608649 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.608663 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:17Z","lastTransitionTime":"2025-10-09T10:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.610278 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs\") pod \"network-metrics-daemon-z74b9\" (UID: \"01aecf36-9a78-414c-8078-5c114c1dfa3f\") " pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.610462 4740 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.610539 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs podName:01aecf36-9a78-414c-8078-5c114c1dfa3f nodeName:}" failed. No retries permitted until 2025-10-09 10:28:19.610517335 +0000 UTC m=+38.572717716 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs") pod "network-metrics-daemon-z74b9" (UID: "01aecf36-9a78-414c-8078-5c114c1dfa3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.626354 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2
bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:17Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.631107 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.631169 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.631180 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.631200 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.631210 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:17Z","lastTransitionTime":"2025-10-09T10:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.643927 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:17Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.647531 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.647567 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.647577 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.647591 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.647602 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:17Z","lastTransitionTime":"2025-10-09T10:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.658440 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:17Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.658915 4740 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.664885 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.664930 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.664943 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.664961 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.664974 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:17Z","lastTransitionTime":"2025-10-09T10:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.754327 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.754525 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.755007 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.755078 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.755402 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:17 crc kubenswrapper[4740]: E1009 10:28:17.755472 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.766779 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.766829 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.766844 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.766864 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.766880 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:17Z","lastTransitionTime":"2025-10-09T10:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.870450 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.870516 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.870531 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.870553 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.870565 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:17Z","lastTransitionTime":"2025-10-09T10:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.973942 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.973998 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.974015 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.974039 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:17 crc kubenswrapper[4740]: I1009 10:28:17.974058 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:17Z","lastTransitionTime":"2025-10-09T10:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.077638 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.077704 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.077749 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.077804 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.077821 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:18Z","lastTransitionTime":"2025-10-09T10:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.182136 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.182172 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.182181 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.182196 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.182206 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:18Z","lastTransitionTime":"2025-10-09T10:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.285327 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.285378 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.285387 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.285399 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.285407 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:18Z","lastTransitionTime":"2025-10-09T10:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.388166 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.388204 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.388212 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.388224 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.388233 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:18Z","lastTransitionTime":"2025-10-09T10:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.491659 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.491707 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.491722 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.491743 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.491782 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:18Z","lastTransitionTime":"2025-10-09T10:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.594258 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.594346 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.594369 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.594399 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.594421 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:18Z","lastTransitionTime":"2025-10-09T10:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.697432 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.697491 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.697508 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.697533 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.697557 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:18Z","lastTransitionTime":"2025-10-09T10:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.753343 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:18 crc kubenswrapper[4740]: E1009 10:28:18.753519 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.801318 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.801363 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.801379 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.801401 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.801420 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:18Z","lastTransitionTime":"2025-10-09T10:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.905641 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.905725 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.905745 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.905815 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:18 crc kubenswrapper[4740]: I1009 10:28:18.905846 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:18Z","lastTransitionTime":"2025-10-09T10:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.008641 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.008739 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.008786 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.008808 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.008823 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:19Z","lastTransitionTime":"2025-10-09T10:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.112540 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.112598 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.112615 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.112640 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.112660 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:19Z","lastTransitionTime":"2025-10-09T10:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.215884 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.215949 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.215967 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.215990 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.216008 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:19Z","lastTransitionTime":"2025-10-09T10:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.319815 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.319871 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.319887 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.319914 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.319933 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:19Z","lastTransitionTime":"2025-10-09T10:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.423255 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.423313 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.423329 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.423350 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.423371 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:19Z","lastTransitionTime":"2025-10-09T10:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.526076 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.526148 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.526171 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.526204 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.526226 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:19Z","lastTransitionTime":"2025-10-09T10:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.628520 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.628920 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.628973 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.629002 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.629018 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:19Z","lastTransitionTime":"2025-10-09T10:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.631210 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs\") pod \"network-metrics-daemon-z74b9\" (UID: \"01aecf36-9a78-414c-8078-5c114c1dfa3f\") " pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:19 crc kubenswrapper[4740]: E1009 10:28:19.631381 4740 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 10:28:19 crc kubenswrapper[4740]: E1009 10:28:19.631483 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs podName:01aecf36-9a78-414c-8078-5c114c1dfa3f nodeName:}" failed. No retries permitted until 2025-10-09 10:28:23.631454251 +0000 UTC m=+42.593654672 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs") pod "network-metrics-daemon-z74b9" (UID: "01aecf36-9a78-414c-8078-5c114c1dfa3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.731689 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.731746 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.731807 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.731834 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.731855 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:19Z","lastTransitionTime":"2025-10-09T10:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.752999 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.753019 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.753185 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:19 crc kubenswrapper[4740]: E1009 10:28:19.753620 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:19 crc kubenswrapper[4740]: E1009 10:28:19.753856 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:19 crc kubenswrapper[4740]: E1009 10:28:19.754063 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.834814 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.834883 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.834899 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.834917 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.834959 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:19Z","lastTransitionTime":"2025-10-09T10:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.938344 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.938394 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.938405 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.938422 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:19 crc kubenswrapper[4740]: I1009 10:28:19.938432 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:19Z","lastTransitionTime":"2025-10-09T10:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.040960 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.041024 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.041044 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.041071 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.041089 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:20Z","lastTransitionTime":"2025-10-09T10:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.144364 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.144432 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.144454 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.144483 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.144508 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:20Z","lastTransitionTime":"2025-10-09T10:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.247222 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.247252 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.247261 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.247273 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.247282 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:20Z","lastTransitionTime":"2025-10-09T10:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.350298 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.350337 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.350349 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.350366 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.350378 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:20Z","lastTransitionTime":"2025-10-09T10:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.453526 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.453576 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.453590 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.453610 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.453625 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:20Z","lastTransitionTime":"2025-10-09T10:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.556210 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.556264 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.556272 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.556291 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.556308 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:20Z","lastTransitionTime":"2025-10-09T10:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.659099 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.659141 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.659150 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.659169 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.659179 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:20Z","lastTransitionTime":"2025-10-09T10:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.753254 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:20 crc kubenswrapper[4740]: E1009 10:28:20.753443 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.761653 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.761714 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.761729 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.761768 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.761783 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:20Z","lastTransitionTime":"2025-10-09T10:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.864289 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.864352 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.864369 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.864392 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.864409 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:20Z","lastTransitionTime":"2025-10-09T10:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.966480 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.966534 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.966554 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.966571 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:20 crc kubenswrapper[4740]: I1009 10:28:20.966612 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:20Z","lastTransitionTime":"2025-10-09T10:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.068684 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.068713 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.068724 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.068741 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.068778 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:21Z","lastTransitionTime":"2025-10-09T10:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.172340 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.172423 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.172448 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.172475 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.172503 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:21Z","lastTransitionTime":"2025-10-09T10:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.275286 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.275362 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.275384 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.275413 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.275433 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:21Z","lastTransitionTime":"2025-10-09T10:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.379164 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.379214 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.379235 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.379257 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.379273 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:21Z","lastTransitionTime":"2025-10-09T10:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.482287 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.482494 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.482510 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.482526 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.482537 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:21Z","lastTransitionTime":"2025-10-09T10:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.586046 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.586099 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.586115 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.586134 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.586155 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:21Z","lastTransitionTime":"2025-10-09T10:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.688735 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.688801 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.688813 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.688833 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.688855 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:21Z","lastTransitionTime":"2025-10-09T10:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.752699 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:21 crc kubenswrapper[4740]: E1009 10:28:21.752857 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.753095 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.753156 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:21 crc kubenswrapper[4740]: E1009 10:28:21.753311 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:21 crc kubenswrapper[4740]: E1009 10:28:21.753477 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.767501 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:21Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.779237 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:21Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.792081 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.792131 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.792159 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.792183 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.792200 4740 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:21Z","lastTransitionTime":"2025-10-09T10:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.793205 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-09T10:28:21Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.807535 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/h
ost/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-09T10:28:21Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.829431 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:13Z\\\",\\\"message\\\":\\\"ns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 10:28:13.797610 6124 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1009 10:28:13.797559 6124 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b
2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:21Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.843386 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:4
4Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:21Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.865873 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:21Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.884295 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:21Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.894292 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.894340 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.894348 4740 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.894359 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.894367 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:21Z","lastTransitionTime":"2025-10-09T10:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.902926 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff7a1d6bb326aeda9c9
5a16b0f56a4096232e1fad83eca05c1a11038b668de8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:21Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.915363 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:21Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.925191 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47870d7b-1faf-4429-81f5-3d0c8b489843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc875051bc1dfc50841bc7e55c02b0d92fe31059e541830612ce459eb1247d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84531844a5e9861188f762135b344d8f89410
bcc2acbf0ec8ac93d188b88bbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjrz8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:21Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.934188 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z74b9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01aecf36-9a78-414c-8078-5c114c1dfa3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z74b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:21Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:21 crc 
kubenswrapper[4740]: I1009 10:28:21.946506 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:21Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.958657 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:21Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.967778 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:21Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.980462 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:21Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.996041 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.996069 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.996080 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.996093 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:21 crc kubenswrapper[4740]: I1009 10:28:21.996103 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:21Z","lastTransitionTime":"2025-10-09T10:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.099310 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.099378 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.099402 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.099430 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.099451 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:22Z","lastTransitionTime":"2025-10-09T10:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.202994 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.203049 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.203069 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.203093 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.203113 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:22Z","lastTransitionTime":"2025-10-09T10:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.306414 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.306453 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.306463 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.306480 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.306491 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:22Z","lastTransitionTime":"2025-10-09T10:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.409214 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.409263 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.409277 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.409297 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.409313 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:22Z","lastTransitionTime":"2025-10-09T10:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.505558 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.506518 4740 scope.go:117] "RemoveContainer" containerID="fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280" Oct 09 10:28:22 crc kubenswrapper[4740]: E1009 10:28:22.506707 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.511834 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.511896 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.511920 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.511951 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.511976 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:22Z","lastTransitionTime":"2025-10-09T10:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.615420 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.615489 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.615503 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.615521 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.615536 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:22Z","lastTransitionTime":"2025-10-09T10:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.718222 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.718296 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.718317 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.718341 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.718359 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:22Z","lastTransitionTime":"2025-10-09T10:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.753694 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:22 crc kubenswrapper[4740]: E1009 10:28:22.753932 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.821922 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.821991 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.822008 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.822045 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.822063 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:22Z","lastTransitionTime":"2025-10-09T10:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.926551 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.926684 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.926707 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.926734 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:22 crc kubenswrapper[4740]: I1009 10:28:22.926785 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:22Z","lastTransitionTime":"2025-10-09T10:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.029791 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.029874 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.029898 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.029928 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.029951 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:23Z","lastTransitionTime":"2025-10-09T10:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.132700 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.132813 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.132838 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.132870 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.132893 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:23Z","lastTransitionTime":"2025-10-09T10:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.236409 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.236475 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.236493 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.236524 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.236548 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:23Z","lastTransitionTime":"2025-10-09T10:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.339287 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.339327 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.339335 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.339348 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.339358 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:23Z","lastTransitionTime":"2025-10-09T10:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.441829 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.441883 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.441899 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.441923 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.441940 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:23Z","lastTransitionTime":"2025-10-09T10:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.545135 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.545215 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.545239 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.545267 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.545285 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:23Z","lastTransitionTime":"2025-10-09T10:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.648604 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.648672 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.648691 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.648716 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.648733 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:23Z","lastTransitionTime":"2025-10-09T10:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.675943 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs\") pod \"network-metrics-daemon-z74b9\" (UID: \"01aecf36-9a78-414c-8078-5c114c1dfa3f\") " pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:23 crc kubenswrapper[4740]: E1009 10:28:23.676133 4740 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 10:28:23 crc kubenswrapper[4740]: E1009 10:28:23.676246 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs podName:01aecf36-9a78-414c-8078-5c114c1dfa3f nodeName:}" failed. No retries permitted until 2025-10-09 10:28:31.676220138 +0000 UTC m=+50.638420559 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs") pod "network-metrics-daemon-z74b9" (UID: "01aecf36-9a78-414c-8078-5c114c1dfa3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.752247 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.752292 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.752304 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.752320 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.752331 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:23Z","lastTransitionTime":"2025-10-09T10:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.752536 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:23 crc kubenswrapper[4740]: E1009 10:28:23.752628 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.752685 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:23 crc kubenswrapper[4740]: E1009 10:28:23.752739 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.752909 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:23 crc kubenswrapper[4740]: E1009 10:28:23.752983 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.855644 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.855709 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.855729 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.855792 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.855815 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:23Z","lastTransitionTime":"2025-10-09T10:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.958577 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.958650 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.958669 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.958690 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:23 crc kubenswrapper[4740]: I1009 10:28:23.958707 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:23Z","lastTransitionTime":"2025-10-09T10:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.062044 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.062094 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.062111 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.062132 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.062148 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:24Z","lastTransitionTime":"2025-10-09T10:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.164892 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.164978 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.164997 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.165018 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.165035 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:24Z","lastTransitionTime":"2025-10-09T10:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.267948 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.268122 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.268158 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.268188 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.268209 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:24Z","lastTransitionTime":"2025-10-09T10:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.371827 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.371916 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.371953 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.371985 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.372006 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:24Z","lastTransitionTime":"2025-10-09T10:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.474278 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.474340 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.474355 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.474373 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.474383 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:24Z","lastTransitionTime":"2025-10-09T10:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.578236 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.578300 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.578317 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.578340 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.578359 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:24Z","lastTransitionTime":"2025-10-09T10:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.681721 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.681850 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.681876 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.681904 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.681924 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:24Z","lastTransitionTime":"2025-10-09T10:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.753091 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:24 crc kubenswrapper[4740]: E1009 10:28:24.753273 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.785215 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.785283 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.785307 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.785335 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.785360 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:24Z","lastTransitionTime":"2025-10-09T10:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.888934 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.889009 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.889032 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.889092 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.889116 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:24Z","lastTransitionTime":"2025-10-09T10:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.991869 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.991940 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.991956 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.991980 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:24 crc kubenswrapper[4740]: I1009 10:28:24.991996 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:24Z","lastTransitionTime":"2025-10-09T10:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.095386 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.095465 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.095500 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.095529 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.095550 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:25Z","lastTransitionTime":"2025-10-09T10:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.197887 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.197944 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.197966 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.198012 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.198028 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:25Z","lastTransitionTime":"2025-10-09T10:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.300813 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.300868 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.300883 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.300924 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.300939 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:25Z","lastTransitionTime":"2025-10-09T10:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.403157 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.403206 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.403222 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.403245 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.403261 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:25Z","lastTransitionTime":"2025-10-09T10:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.511374 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.511427 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.511443 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.511464 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.511484 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:25Z","lastTransitionTime":"2025-10-09T10:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.613409 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.613443 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.613454 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.613470 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.613482 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:25Z","lastTransitionTime":"2025-10-09T10:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.715996 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.716087 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.716111 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.716139 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.716157 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:25Z","lastTransitionTime":"2025-10-09T10:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.753629 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.753644 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:25 crc kubenswrapper[4740]: E1009 10:28:25.753854 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.753917 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:25 crc kubenswrapper[4740]: E1009 10:28:25.754059 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:25 crc kubenswrapper[4740]: E1009 10:28:25.754162 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.818945 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.818996 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.819013 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.819038 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.819055 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:25Z","lastTransitionTime":"2025-10-09T10:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.922248 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.922296 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.922313 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.922336 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:25 crc kubenswrapper[4740]: I1009 10:28:25.922353 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:25Z","lastTransitionTime":"2025-10-09T10:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.025613 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.025674 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.025697 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.025724 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.025745 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:26Z","lastTransitionTime":"2025-10-09T10:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.128393 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.128473 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.128497 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.128532 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.128558 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:26Z","lastTransitionTime":"2025-10-09T10:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.231202 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.231263 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.231279 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.231329 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.231365 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:26Z","lastTransitionTime":"2025-10-09T10:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.334519 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.334578 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.334593 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.334617 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.334634 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:26Z","lastTransitionTime":"2025-10-09T10:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.437749 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.437810 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.437818 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.437833 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.437842 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:26Z","lastTransitionTime":"2025-10-09T10:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.540794 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.540848 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.540886 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.540933 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.540956 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:26Z","lastTransitionTime":"2025-10-09T10:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.644822 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.644899 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.644918 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.644947 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.644970 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:26Z","lastTransitionTime":"2025-10-09T10:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.748130 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.748181 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.748196 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.748221 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.748236 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:26Z","lastTransitionTime":"2025-10-09T10:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.753454 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:26 crc kubenswrapper[4740]: E1009 10:28:26.753698 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.850920 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.850988 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.851005 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.851031 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.851049 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:26Z","lastTransitionTime":"2025-10-09T10:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.953977 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.954034 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.954050 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.954073 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:26 crc kubenswrapper[4740]: I1009 10:28:26.954090 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:26Z","lastTransitionTime":"2025-10-09T10:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.056711 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.056800 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.056810 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.056830 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.056843 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:27Z","lastTransitionTime":"2025-10-09T10:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.159963 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.160048 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.160066 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.160091 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.160106 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:27Z","lastTransitionTime":"2025-10-09T10:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.264604 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.265050 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.265115 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.265192 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.265265 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:27Z","lastTransitionTime":"2025-10-09T10:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.368658 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.368713 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.368728 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.368786 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.368802 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:27Z","lastTransitionTime":"2025-10-09T10:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.472825 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.472870 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.472887 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.472910 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.472928 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:27Z","lastTransitionTime":"2025-10-09T10:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.576095 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.576142 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.576155 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.576172 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.576183 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:27Z","lastTransitionTime":"2025-10-09T10:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.679196 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.679278 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.679301 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.679332 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.679356 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:27Z","lastTransitionTime":"2025-10-09T10:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.753501 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.753563 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.753501 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:27 crc kubenswrapper[4740]: E1009 10:28:27.753853 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:27 crc kubenswrapper[4740]: E1009 10:28:27.753993 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:27 crc kubenswrapper[4740]: E1009 10:28:27.754147 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.781816 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.781888 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.781908 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.781931 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.781948 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:27Z","lastTransitionTime":"2025-10-09T10:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.798836 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.799039 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.799204 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.799344 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.799470 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:27Z","lastTransitionTime":"2025-10-09T10:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:27 crc kubenswrapper[4740]: E1009 10:28:27.818810 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:27Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.824303 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.824520 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.824674 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.824888 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.825083 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:27Z","lastTransitionTime":"2025-10-09T10:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:27 crc kubenswrapper[4740]: E1009 10:28:27.844609 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:27Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.852542 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.852593 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.852605 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.852627 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.852639 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:27Z","lastTransitionTime":"2025-10-09T10:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:27 crc kubenswrapper[4740]: E1009 10:28:27.872465 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:27Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.876297 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.876331 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.876339 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.876353 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.876360 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:27Z","lastTransitionTime":"2025-10-09T10:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:27 crc kubenswrapper[4740]: E1009 10:28:27.887599 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:27Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.890940 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.891064 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.891155 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.891228 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.891301 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:27Z","lastTransitionTime":"2025-10-09T10:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:27 crc kubenswrapper[4740]: E1009 10:28:27.908295 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:27Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:27 crc kubenswrapper[4740]: E1009 10:28:27.908502 4740 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.910522 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.910595 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.910611 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.910629 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:27 crc kubenswrapper[4740]: I1009 10:28:27.910639 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:27Z","lastTransitionTime":"2025-10-09T10:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.013467 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.013526 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.013536 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.013554 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.013566 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:28Z","lastTransitionTime":"2025-10-09T10:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.117432 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.117503 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.117528 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.117557 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.117579 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:28Z","lastTransitionTime":"2025-10-09T10:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.219837 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.219903 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.219916 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.219932 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.219944 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:28Z","lastTransitionTime":"2025-10-09T10:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.323187 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.323230 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.323252 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.323279 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.323295 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:28Z","lastTransitionTime":"2025-10-09T10:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.426910 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.426971 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.426986 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.427006 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.427021 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:28Z","lastTransitionTime":"2025-10-09T10:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.529978 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.530405 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.530593 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.530805 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.530975 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:28Z","lastTransitionTime":"2025-10-09T10:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.633537 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.633841 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.633957 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.634057 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.634121 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:28Z","lastTransitionTime":"2025-10-09T10:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.737146 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.737229 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.737255 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.737286 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.737311 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:28Z","lastTransitionTime":"2025-10-09T10:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.752867 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:28 crc kubenswrapper[4740]: E1009 10:28:28.753148 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.839989 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.840286 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.840585 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.840721 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.840873 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:28Z","lastTransitionTime":"2025-10-09T10:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.943710 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.943791 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.943807 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.943829 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:28 crc kubenswrapper[4740]: I1009 10:28:28.943846 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:28Z","lastTransitionTime":"2025-10-09T10:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.046014 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.046068 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.046082 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.046101 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.046115 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:29Z","lastTransitionTime":"2025-10-09T10:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.148180 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.148236 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.148247 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.148265 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.148276 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:29Z","lastTransitionTime":"2025-10-09T10:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.251004 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.251043 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.251051 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.251065 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.251074 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:29Z","lastTransitionTime":"2025-10-09T10:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.353199 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.353222 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.353229 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.353241 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.353249 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:29Z","lastTransitionTime":"2025-10-09T10:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.455174 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.455203 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.455212 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.455225 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.455235 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:29Z","lastTransitionTime":"2025-10-09T10:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.558295 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.558550 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.558620 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.558684 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.558737 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:29Z","lastTransitionTime":"2025-10-09T10:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.661356 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.661386 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.661396 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.661409 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.661420 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:29Z","lastTransitionTime":"2025-10-09T10:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.752853 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.753028 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:29 crc kubenswrapper[4740]: E1009 10:28:29.753179 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.753199 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:29 crc kubenswrapper[4740]: E1009 10:28:29.753292 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:29 crc kubenswrapper[4740]: E1009 10:28:29.753289 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.763352 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.763390 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.763400 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.763416 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.763442 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:29Z","lastTransitionTime":"2025-10-09T10:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.867046 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.867107 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.867124 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.867147 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.867192 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:29Z","lastTransitionTime":"2025-10-09T10:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.969945 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.970011 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.970036 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.970065 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:29 crc kubenswrapper[4740]: I1009 10:28:29.970087 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:29Z","lastTransitionTime":"2025-10-09T10:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.072404 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.072469 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.072490 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.072515 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.072532 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:30Z","lastTransitionTime":"2025-10-09T10:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.174899 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.174955 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.174975 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.174998 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.175017 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:30Z","lastTransitionTime":"2025-10-09T10:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.277853 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.277929 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.277954 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.277988 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.278024 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:30Z","lastTransitionTime":"2025-10-09T10:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.380933 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.380986 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.380997 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.381017 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.381030 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:30Z","lastTransitionTime":"2025-10-09T10:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.484237 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.484308 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.484325 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.484348 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.484365 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:30Z","lastTransitionTime":"2025-10-09T10:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.587998 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.588041 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.588048 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.588066 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.588075 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:30Z","lastTransitionTime":"2025-10-09T10:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.690784 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.690876 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.690893 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.690918 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.690937 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:30Z","lastTransitionTime":"2025-10-09T10:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.752824 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:30 crc kubenswrapper[4740]: E1009 10:28:30.752972 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.794371 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.794445 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.794468 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.794498 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.794519 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:30Z","lastTransitionTime":"2025-10-09T10:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.897127 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.897170 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.897182 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.897197 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:30 crc kubenswrapper[4740]: I1009 10:28:30.897206 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:30Z","lastTransitionTime":"2025-10-09T10:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.000231 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.000272 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.000281 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.000295 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.000305 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:31Z","lastTransitionTime":"2025-10-09T10:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.103277 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.103343 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.103364 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.103391 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.103415 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:31Z","lastTransitionTime":"2025-10-09T10:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.205907 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.205936 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.205946 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.205961 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.205969 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:31Z","lastTransitionTime":"2025-10-09T10:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.308313 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.308381 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.308403 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.308428 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.308446 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:31Z","lastTransitionTime":"2025-10-09T10:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.415740 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.415835 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.415853 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.415878 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.415895 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:31Z","lastTransitionTime":"2025-10-09T10:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.518589 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.518673 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.518693 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.518723 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.518749 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:31Z","lastTransitionTime":"2025-10-09T10:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.621021 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.621104 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.621122 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.621146 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.621163 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:31Z","lastTransitionTime":"2025-10-09T10:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.725013 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.725087 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.725106 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.725128 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.725147 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:31Z","lastTransitionTime":"2025-10-09T10:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.753654 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:31 crc kubenswrapper[4740]: E1009 10:28:31.753784 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.753655 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.753990 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:31 crc kubenswrapper[4740]: E1009 10:28:31.754113 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:31 crc kubenswrapper[4740]: E1009 10:28:31.754313 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.757791 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs\") pod \"network-metrics-daemon-z74b9\" (UID: \"01aecf36-9a78-414c-8078-5c114c1dfa3f\") " pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:31 crc kubenswrapper[4740]: E1009 10:28:31.758011 4740 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 10:28:31 crc kubenswrapper[4740]: E1009 10:28:31.758097 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs podName:01aecf36-9a78-414c-8078-5c114c1dfa3f nodeName:}" failed. No retries permitted until 2025-10-09 10:28:47.758069177 +0000 UTC m=+66.720269598 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs") pod "network-metrics-daemon-z74b9" (UID: "01aecf36-9a78-414c-8078-5c114c1dfa3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.774207 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z74b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01aecf36-9a78-414c-8078-5c114c1dfa3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z74b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:31Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:31 crc 
kubenswrapper[4740]: I1009 10:28:31.799164 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:31Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.820884 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:31Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.828465 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.828569 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.828589 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.828646 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.828665 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:31Z","lastTransitionTime":"2025-10-09T10:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.840601 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47870d7b-1faf-4429-81f5-3d0c8b489843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc875051bc1dfc50841bc7e55c02b0d92fe31059e541830612ce459eb1247d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"
ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84531844a5e9861188f762135b344d8f89410bcc2acbf0ec8ac93d188b88bbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjrz8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:31Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.857139 4740 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\
\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:31Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.875884 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09
T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:31Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.894446 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:31Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.909124 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d90
1bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:31Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.928494 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:31Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.932377 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:31 crc 
kubenswrapper[4740]: I1009 10:28:31.932417 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.932431 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.932452 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.932466 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:31Z","lastTransitionTime":"2025-10-09T10:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.952313 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:13Z\\\",\\\"message\\\":\\\"ns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 10:28:13.797610 6124 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1009 10:28:13.797559 6124 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b
2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:31Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.972249 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:4
4Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:31Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.985585 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-s
cript\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:31Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:31 crc kubenswrapper[4740]: I1009 10:28:31.996681 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:31Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.009819 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:32Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.027914 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff7a1d6bb326aeda9c95a16b0f56a4096232e1fad83eca05c1a11038b668de8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef56
7b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:32Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.034921 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.034986 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.035006 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.035032 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.035050 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:32Z","lastTransitionTime":"2025-10-09T10:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.050903 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:32Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.138157 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.138226 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.138238 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.138255 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.138268 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:32Z","lastTransitionTime":"2025-10-09T10:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.240853 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.240956 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.240975 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.241003 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.241020 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:32Z","lastTransitionTime":"2025-10-09T10:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.347857 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.347926 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.348006 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.348064 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.348089 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:32Z","lastTransitionTime":"2025-10-09T10:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.450836 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.450903 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.450921 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.450945 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.450962 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:32Z","lastTransitionTime":"2025-10-09T10:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.554334 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.554404 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.554427 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.554456 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.554479 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:32Z","lastTransitionTime":"2025-10-09T10:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.657729 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.657810 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.657828 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.657851 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.657868 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:32Z","lastTransitionTime":"2025-10-09T10:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.753021 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:32 crc kubenswrapper[4740]: E1009 10:28:32.753208 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.761698 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.761748 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.761790 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.761815 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.761835 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:32Z","lastTransitionTime":"2025-10-09T10:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.865972 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.866306 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.866448 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.866597 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.866839 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:32Z","lastTransitionTime":"2025-10-09T10:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.970513 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.970579 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.970599 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.970630 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:32 crc kubenswrapper[4740]: I1009 10:28:32.970652 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:32Z","lastTransitionTime":"2025-10-09T10:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.073588 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.073626 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.073637 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.073653 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.073665 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:33Z","lastTransitionTime":"2025-10-09T10:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.176800 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.176900 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.176919 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.176944 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.176992 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:33Z","lastTransitionTime":"2025-10-09T10:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.279844 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.279888 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.279898 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.279914 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.279926 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:33Z","lastTransitionTime":"2025-10-09T10:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.383678 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.383781 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.383794 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.383811 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.383825 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:33Z","lastTransitionTime":"2025-10-09T10:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.487472 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.487873 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.487892 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.487909 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.487923 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:33Z","lastTransitionTime":"2025-10-09T10:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.575324 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.575563 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:33 crc kubenswrapper[4740]: E1009 10:28:33.575588 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:29:05.575552513 +0000 UTC m=+84.537752934 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.575648 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:33 crc kubenswrapper[4740]: E1009 10:28:33.575677 4740 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.575714 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:33 crc kubenswrapper[4740]: E1009 10:28:33.575789 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 10:29:05.575727918 +0000 UTC m=+84.537928339 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.575821 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:33 crc kubenswrapper[4740]: E1009 10:28:33.575884 4740 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 10:28:33 crc kubenswrapper[4740]: E1009 10:28:33.575914 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 10:28:33 crc kubenswrapper[4740]: E1009 10:28:33.575950 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 10:28:33 crc kubenswrapper[4740]: E1009 10:28:33.575970 4740 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:33 crc kubenswrapper[4740]: E1009 10:28:33.575971 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: 
object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 10:28:33 crc kubenswrapper[4740]: E1009 10:28:33.576000 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 10:28:33 crc kubenswrapper[4740]: E1009 10:28:33.576019 4740 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:33 crc kubenswrapper[4740]: E1009 10:28:33.575954 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 10:29:05.575939633 +0000 UTC m=+84.538140044 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 10:28:33 crc kubenswrapper[4740]: E1009 10:28:33.576131 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 10:29:05.576067827 +0000 UTC m=+84.538268218 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:33 crc kubenswrapper[4740]: E1009 10:28:33.576162 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 10:29:05.576151069 +0000 UTC m=+84.538351470 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.590079 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.590135 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.590151 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.590166 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.590177 4740 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:33Z","lastTransitionTime":"2025-10-09T10:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.693397 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.693506 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.693528 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.693552 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.693570 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:33Z","lastTransitionTime":"2025-10-09T10:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.753149 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.753193 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.753234 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:33 crc kubenswrapper[4740]: E1009 10:28:33.753367 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:33 crc kubenswrapper[4740]: E1009 10:28:33.753492 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:33 crc kubenswrapper[4740]: E1009 10:28:33.753723 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.796462 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.796520 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.796536 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.796560 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.796576 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:33Z","lastTransitionTime":"2025-10-09T10:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.900195 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.900259 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.900276 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.900299 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:33 crc kubenswrapper[4740]: I1009 10:28:33.900316 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:33Z","lastTransitionTime":"2025-10-09T10:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.002628 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.002663 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.002671 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.002683 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.002692 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:34Z","lastTransitionTime":"2025-10-09T10:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.105875 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.105938 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.105959 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.105991 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.106013 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:34Z","lastTransitionTime":"2025-10-09T10:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.209242 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.209309 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.209332 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.209359 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.209381 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:34Z","lastTransitionTime":"2025-10-09T10:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.312686 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.312748 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.312814 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.312845 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.312867 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:34Z","lastTransitionTime":"2025-10-09T10:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.416004 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.416095 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.416132 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.416169 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.416211 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:34Z","lastTransitionTime":"2025-10-09T10:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.519186 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.519246 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.519263 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.519286 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.519303 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:34Z","lastTransitionTime":"2025-10-09T10:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.622054 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.622125 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.622149 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.622177 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.622200 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:34Z","lastTransitionTime":"2025-10-09T10:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.724876 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.724965 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.724989 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.725024 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.725083 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:34Z","lastTransitionTime":"2025-10-09T10:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.753182 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:34 crc kubenswrapper[4740]: E1009 10:28:34.753403 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.828588 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.828644 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.828661 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.828687 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.828705 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:34Z","lastTransitionTime":"2025-10-09T10:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.932114 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.932173 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.932184 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.932204 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:34 crc kubenswrapper[4740]: I1009 10:28:34.932226 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:34Z","lastTransitionTime":"2025-10-09T10:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.035306 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.039013 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.039035 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.039066 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.039081 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:35Z","lastTransitionTime":"2025-10-09T10:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.143915 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.143977 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.143993 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.144013 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.144029 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:35Z","lastTransitionTime":"2025-10-09T10:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.247047 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.247113 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.247150 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.247190 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.247217 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:35Z","lastTransitionTime":"2025-10-09T10:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.349669 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.349706 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.349716 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.349739 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.349779 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:35Z","lastTransitionTime":"2025-10-09T10:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.452860 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.452907 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.452919 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.452936 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.452949 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:35Z","lastTransitionTime":"2025-10-09T10:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.555282 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.555355 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.555379 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.555404 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.555427 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:35Z","lastTransitionTime":"2025-10-09T10:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.658623 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.658685 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.658707 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.658734 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.658797 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:35Z","lastTransitionTime":"2025-10-09T10:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.753560 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.753702 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:35 crc kubenswrapper[4740]: E1009 10:28:35.753862 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.754054 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:35 crc kubenswrapper[4740]: E1009 10:28:35.754213 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:35 crc kubenswrapper[4740]: E1009 10:28:35.754329 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.762671 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.762794 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.762822 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.763178 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.763245 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:35Z","lastTransitionTime":"2025-10-09T10:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.866691 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.866895 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.866919 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.866986 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.867010 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:35Z","lastTransitionTime":"2025-10-09T10:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.970475 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.970532 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.970549 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.970573 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:35 crc kubenswrapper[4740]: I1009 10:28:35.970590 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:35Z","lastTransitionTime":"2025-10-09T10:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.073070 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.073124 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.073139 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.073158 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.073172 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:36Z","lastTransitionTime":"2025-10-09T10:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.085060 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.099411 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.100048 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2025-10-09T10:28:36Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.115349 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"}
,{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2025-10-09T10:28:36Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.140366 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:13Z\\\",\\\"message\\\":\\\"ns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 10:28:13.797610 6124 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1009 10:28:13.797559 6124 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b
2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:36Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.155564 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:4
4Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:36Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.170556 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-s
cript\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:36Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.175332 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.175372 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.175384 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.175404 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.175417 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:36Z","lastTransitionTime":"2025-10-09T10:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.182460 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:36Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.198925 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:36Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.222256 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff7a1d6bb326aeda9c95a16b0f56a4096232e1fad83eca05c1a11038b668de8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef56
7b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:36Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.241860 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:36Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.256515 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z74b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01aecf36-9a78-414c-8078-5c114c1dfa3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z74b9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:36Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.275476 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:36Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.277182 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.277218 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.277229 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.277247 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.277259 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:36Z","lastTransitionTime":"2025-10-09T10:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.293206 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:36Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.311546 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47870d7b-1faf-4429-81f5-3d0c8b489843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc875051bc1dfc50841bc7e55c02b0d92fe31059e541830612ce459eb1247d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84531844a5e9861188f762135b344d8f89410
bcc2acbf0ec8ac93d188b88bbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjrz8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:36Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.328237 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:36Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.347026 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:36Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.361910 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:36Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.379988 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.380037 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.380052 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.380071 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.380086 4740 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:36Z","lastTransitionTime":"2025-10-09T10:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.483811 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.483900 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.483925 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.483957 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.483980 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:36Z","lastTransitionTime":"2025-10-09T10:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.587135 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.587380 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.587522 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.587651 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.587876 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:36Z","lastTransitionTime":"2025-10-09T10:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.691177 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.691247 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.691272 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.691300 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.691321 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:36Z","lastTransitionTime":"2025-10-09T10:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.753172 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:36 crc kubenswrapper[4740]: E1009 10:28:36.753370 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.754722 4740 scope.go:117] "RemoveContainer" containerID="fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.794861 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.794902 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.794917 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.794939 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.794955 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:36Z","lastTransitionTime":"2025-10-09T10:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.897832 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.897873 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.897885 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.897901 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:36 crc kubenswrapper[4740]: I1009 10:28:36.897916 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:36Z","lastTransitionTime":"2025-10-09T10:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.000288 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.000324 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.000336 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.000350 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.000360 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:37Z","lastTransitionTime":"2025-10-09T10:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.092137 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-klnl8_192f5d73-ad53-4674-8c35-c72343c6022e/ovnkube-controller/1.log" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.094306 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerStarted","Data":"5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd"} Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.094862 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.102678 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.102703 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.102712 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.102725 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.102735 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:37Z","lastTransitionTime":"2025-10-09T10:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.106005 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:37Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.121183 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:37Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.137950 4740 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d9545b-0075-4442-ab50-88400a66cbc6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99eb305059d073b23f482d05ece1d61192433362fffd0bc220e2d1ddd21c8943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a9cf35703b8479a9f7662d356465d6a50571a64ac5e106ec44c26e3656f815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6
b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f693c5e310d6830896a21d092ec855a8f9a5ea16c9fa82d18f9aa2e5fe6e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c020efcdc9c34156c4ce09e7186644e1e1d9a1cd49a67cc294262ccbf68ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-
host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11c020efcdc9c34156c4ce09e7186644e1e1d9a1cd49a67cc294262ccbf68ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:37Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.157691 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:37Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.169865 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d90
1bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:37Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.185069 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:37Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.205430 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:37 crc 
kubenswrapper[4740]: I1009 10:28:37.205471 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.205479 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.205493 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.205501 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:37Z","lastTransitionTime":"2025-10-09T10:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.216948 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:13Z\\\",\\\"message\\\":\\\"ns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 10:28:13.797610 6124 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1009 10:28:13.797559 6124 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/
\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:37Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.235471 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:4
4Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:37Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.246048 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-s
cript\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:37Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.254133 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:37Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.265681 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:37Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.278179 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff7a1d6bb326aeda9c95a16b0f56a4096232e1fad83eca05c1a11038b668de8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef56
7b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:37Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.289037 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:37Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.299388 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z74b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01aecf36-9a78-414c-8078-5c114c1dfa3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z74b9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:37Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.307855 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.307899 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.307912 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.307945 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.307957 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:37Z","lastTransitionTime":"2025-10-09T10:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.311186 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:37Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.323336 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:37Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.335659 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47870d7b-1faf-4429-81f5-3d0c8b489843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc875051bc1dfc50841bc7e55c02b0d92fe31059e541830612ce459eb1247d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84531844a5e9861188f762135b344d8f89410
bcc2acbf0ec8ac93d188b88bbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjrz8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:37Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.410301 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.410332 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.410340 4740 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.410353 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.410362 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:37Z","lastTransitionTime":"2025-10-09T10:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.512218 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.512247 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.512257 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.512271 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.512306 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:37Z","lastTransitionTime":"2025-10-09T10:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.617066 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.617120 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.617135 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.617161 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.617175 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:37Z","lastTransitionTime":"2025-10-09T10:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.720466 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.720547 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.720572 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.720597 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.720614 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:37Z","lastTransitionTime":"2025-10-09T10:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.755552 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:37 crc kubenswrapper[4740]: E1009 10:28:37.755721 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.756238 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:37 crc kubenswrapper[4740]: E1009 10:28:37.756299 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.756347 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:37 crc kubenswrapper[4740]: E1009 10:28:37.756395 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.824315 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.824388 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.824425 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.824446 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.824459 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:37Z","lastTransitionTime":"2025-10-09T10:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.927298 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.927361 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.927384 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.927413 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:37 crc kubenswrapper[4740]: I1009 10:28:37.927436 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:37Z","lastTransitionTime":"2025-10-09T10:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.031209 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.031256 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.031264 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.031278 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.031287 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:38Z","lastTransitionTime":"2025-10-09T10:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.104445 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-klnl8_192f5d73-ad53-4674-8c35-c72343c6022e/ovnkube-controller/2.log" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.105208 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-klnl8_192f5d73-ad53-4674-8c35-c72343c6022e/ovnkube-controller/1.log" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.109218 4740 generic.go:334] "Generic (PLEG): container finished" podID="192f5d73-ad53-4674-8c35-c72343c6022e" containerID="5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd" exitCode=1 Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.109277 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerDied","Data":"5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd"} Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.109324 4740 scope.go:117] "RemoveContainer" containerID="fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.110570 4740 scope.go:117] "RemoveContainer" containerID="5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd" Oct 09 10:28:38 crc kubenswrapper[4740]: E1009 10:28:38.110864 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.129550 4740 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.134792 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.134817 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.134825 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.134838 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.134847 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:38Z","lastTransitionTime":"2025-10-09T10:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.145337 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.155318 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.155551 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.155632 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.155710 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.155789 4740 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:38Z","lastTransitionTime":"2025-10-09T10:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.162895 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff7a1d6bb326aeda9c95a16b0f56a4096232e1fad83eca05c1a11038b668de8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:38 crc kubenswrapper[4740]: E1009 10:28:38.168680 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.173459 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.173508 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.173519 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.173534 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.173546 4740 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:38Z","lastTransitionTime":"2025-10-09T10:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.179403 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:38 crc kubenswrapper[4740]: E1009 10:28:38.191371 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.196561 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.196604 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.196616 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.196634 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.196648 4740 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:38Z","lastTransitionTime":"2025-10-09T10:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.198819 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:38 crc kubenswrapper[4740]: E1009 10:28:38.210668 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.211116 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47870d7b-1faf-4429-81f5-3d0c8b489843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc875051bc1dfc50841bc7e55c02b0d92fe31059e541830612ce459eb1247d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84531844a5e9861188f762135b344d8f89410
bcc2acbf0ec8ac93d188b88bbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjrz8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.215277 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.215359 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.215443 4740 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.215466 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.215479 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:38Z","lastTransitionTime":"2025-10-09T10:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.223116 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z74b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01aecf36-9a78-414c-8078-5c114c1dfa3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z74b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 
09 10:28:38 crc kubenswrapper[4740]: E1009 10:28:38.228000 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.232305 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.232571 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.232659 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.232775 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.232879 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:38Z","lastTransitionTime":"2025-10-09T10:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.236385 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e
2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:38 crc kubenswrapper[4740]: E1009 10:28:38.245370 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:38 crc kubenswrapper[4740]: E1009 10:28:38.245673 4740 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.247288 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.247320 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.247329 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.247346 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.247361 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:38Z","lastTransitionTime":"2025-10-09T10:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.247679 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d9545b-0075-4442-ab50-88400a66cbc6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99eb305059d073b23f482d05ece1d61192433362fffd0bc220e2d1ddd21c8943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a9cf35703b8479a9f7662d356465
d6a50571a64ac5e106ec44c26e3656f815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f693c5e310d6830896a21d092ec855a8f9a5ea16c9fa82d18f9aa2e5fe6e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c020efcdc9c34156c4ce09e7186644e1e1d9a1cd49a67cc294262ccbf68ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11c020efcdc9c34156c4ce09e7186644e1e1d9a1cd49a67cc294262ccbf68ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.259136 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.271671 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.284878 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.296292 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.308679 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.320448 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a
9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.331582 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.349648 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.349686 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.349696 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.349710 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.349720 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:38Z","lastTransitionTime":"2025-10-09T10:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.353561 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbc7440d5730d74c3132f6c270f5ecdf394df316a02fc7718344d1d192e8a280\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:13Z\\\",\\\"message\\\":\\\"ns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 10:28:13.797610 6124 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1009 10:28:13.797559 6124 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:37Z\\\",\\\"message\\\":\\\"oints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1009 10:28:37.598511 6408 services_controller.go:453] Built service openshift-etcd/etcd template LB for network=default: []services.LB{}\\\\nI1009 10:28:37.598527 6408 services_controller.go:444] Built service openshift-ingress-operator/metrics LB per-node configs for network=default: 
[]services.lbConfig(nil)\\\\nI1009 10:28:37.598535 6408 services_controller.go:454] Service openshift-etcd/etcd for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1009 10:28:37.598558 6408 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1009 10:28:37.598530 6408 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\"
:\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/
env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:38Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:38 crc 
kubenswrapper[4740]: I1009 10:28:38.453315 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.453393 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.453412 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.453437 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.453455 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:38Z","lastTransitionTime":"2025-10-09T10:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.557586 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.557664 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.557688 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.557718 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.557742 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:38Z","lastTransitionTime":"2025-10-09T10:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.660588 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.660647 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.660664 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.660691 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.660708 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:38Z","lastTransitionTime":"2025-10-09T10:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.753213 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:38 crc kubenswrapper[4740]: E1009 10:28:38.753408 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.763857 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.763907 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.763926 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.764398 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.764501 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:38Z","lastTransitionTime":"2025-10-09T10:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.867462 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.867616 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.867711 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.867740 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.867834 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:38Z","lastTransitionTime":"2025-10-09T10:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.971538 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.971908 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.972126 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.972379 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:38 crc kubenswrapper[4740]: I1009 10:28:38.972591 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:38Z","lastTransitionTime":"2025-10-09T10:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.075878 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.075940 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.075962 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.075988 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.076009 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:39Z","lastTransitionTime":"2025-10-09T10:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.114783 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-klnl8_192f5d73-ad53-4674-8c35-c72343c6022e/ovnkube-controller/2.log" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.118826 4740 scope.go:117] "RemoveContainer" containerID="5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd" Oct 09 10:28:39 crc kubenswrapper[4740]: E1009 10:28:39.120032 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.143148 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:37Z\\\",\\\"message\\\":\\\"oints{}, externalTrafficLocal:false, internalTrafficLocal:false, 
hasNodePort:false}}\\\\nI1009 10:28:37.598511 6408 services_controller.go:453] Built service openshift-etcd/etcd template LB for network=default: []services.LB{}\\\\nI1009 10:28:37.598527 6408 services_controller.go:444] Built service openshift-ingress-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1009 10:28:37.598535 6408 services_controller.go:454] Service openshift-etcd/etcd for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1009 10:28:37.598558 6408 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1009 10:28:37.598530 6408 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b
2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:39Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.180072 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.180124 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.180135 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.180154 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.180168 4740 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:39Z","lastTransitionTime":"2025-10-09T10:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.182542 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1
162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:39Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.214552 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T10:28:39Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.228635 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:39Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.243235 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a
9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:39Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.259904 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:39Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.275825 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:39Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.282145 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.282190 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.282202 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.282218 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.282230 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:39Z","lastTransitionTime":"2025-10-09T10:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.289391 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:39Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.307489 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff7a1d6bb326aeda9c95a16b0f56a4096232e1fad83eca05c1a11038b668de8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef56
7b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:39Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.323707 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:39Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.343862 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:39Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.361592 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47870d7b-1faf-4429-81f5-3d0c8b489843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc875051bc1dfc50841bc7e55c02b0d92fe31059e541830612ce459eb1247d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84531844a5e9861188f762135b344d8f89410
bcc2acbf0ec8ac93d188b88bbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjrz8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:39Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.378913 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z74b9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01aecf36-9a78-414c-8078-5c114c1dfa3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z74b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:39Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:39 crc 
kubenswrapper[4740]: I1009 10:28:39.385666 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.385710 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.385769 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.385790 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.385806 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:39Z","lastTransitionTime":"2025-10-09T10:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.400801 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e
2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:39Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.419160 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d9545b-0075-4442-ab50-88400a66cbc6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99eb305059d073b23f482d05ece1d61192433362fffd0bc220e2d1ddd21c8943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a9cf35703b8479a9f7662d356465d6a50571a64ac5e106ec44c26e3656f815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f693c5e310d6830896a21d092ec855a8f9a5ea16c9fa82d18f9aa2e5fe6e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c020efcdc9c34156c4ce09e7186644e1e1d9a1cd49a67cc294262ccbf68ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://11c020efcdc9c34156c4ce09e7186644e1e1d9a1cd49a67cc294262ccbf68ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:39Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.439264 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:39Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.455525 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:39Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.488319 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.488375 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.488392 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.488415 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.488434 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:39Z","lastTransitionTime":"2025-10-09T10:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.592372 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.592432 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.592449 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.592473 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.592491 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:39Z","lastTransitionTime":"2025-10-09T10:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.695943 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.696003 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.696015 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.696032 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.696047 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:39Z","lastTransitionTime":"2025-10-09T10:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.753206 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:39 crc kubenswrapper[4740]: E1009 10:28:39.753366 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.753431 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.753233 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:39 crc kubenswrapper[4740]: E1009 10:28:39.753608 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:39 crc kubenswrapper[4740]: E1009 10:28:39.753689 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.798343 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.798410 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.798425 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.798461 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.798477 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:39Z","lastTransitionTime":"2025-10-09T10:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.902260 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.902321 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.902338 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.902363 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:39 crc kubenswrapper[4740]: I1009 10:28:39.902380 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:39Z","lastTransitionTime":"2025-10-09T10:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.004556 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.004597 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.004609 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.004623 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.004634 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:40Z","lastTransitionTime":"2025-10-09T10:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.107606 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.107664 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.107682 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.107705 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.107722 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:40Z","lastTransitionTime":"2025-10-09T10:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.211185 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.211243 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.211266 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.211298 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.211322 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:40Z","lastTransitionTime":"2025-10-09T10:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.314191 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.314257 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.314283 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.314314 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.314334 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:40Z","lastTransitionTime":"2025-10-09T10:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.428007 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.428070 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.428089 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.428122 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.428140 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:40Z","lastTransitionTime":"2025-10-09T10:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.531687 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.531782 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.531794 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.531811 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.531825 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:40Z","lastTransitionTime":"2025-10-09T10:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.635067 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.635181 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.635198 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.635222 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.635238 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:40Z","lastTransitionTime":"2025-10-09T10:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.738359 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.738410 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.738424 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.738442 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.738455 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:40Z","lastTransitionTime":"2025-10-09T10:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.752643 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:40 crc kubenswrapper[4740]: E1009 10:28:40.752821 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.841307 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.841347 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.841355 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.841368 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.841376 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:40Z","lastTransitionTime":"2025-10-09T10:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.943846 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.943912 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.943926 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.943945 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:40 crc kubenswrapper[4740]: I1009 10:28:40.943959 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:40Z","lastTransitionTime":"2025-10-09T10:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.046414 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.046538 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.046557 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.046579 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.046596 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:41Z","lastTransitionTime":"2025-10-09T10:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.149656 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.149701 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.149712 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.149726 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.149735 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:41Z","lastTransitionTime":"2025-10-09T10:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.252111 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.252227 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.252303 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.252336 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.252359 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:41Z","lastTransitionTime":"2025-10-09T10:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.355383 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.355508 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.355532 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.355557 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.355576 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:41Z","lastTransitionTime":"2025-10-09T10:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.458430 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.458490 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.458508 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.458534 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.458552 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:41Z","lastTransitionTime":"2025-10-09T10:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.561484 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.561527 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.561536 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.561551 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.561560 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:41Z","lastTransitionTime":"2025-10-09T10:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.664433 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.664497 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.664514 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.664538 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.664556 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:41Z","lastTransitionTime":"2025-10-09T10:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.753249 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.753307 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.753303 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:41 crc kubenswrapper[4740]: E1009 10:28:41.753365 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:41 crc kubenswrapper[4740]: E1009 10:28:41.753562 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:41 crc kubenswrapper[4740]: E1009 10:28:41.753646 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.767108 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.767154 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.767166 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.767182 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.767193 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:41Z","lastTransitionTime":"2025-10-09T10:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.772152 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:41Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.789032 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:41Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.809228 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff7a1d6bb326aeda9c95a16b0f56a4096232e1fad83eca05c1a11038b668de8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef56
7b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:41Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.821411 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47870d7b-1faf-4429-81f5-3d0c8b489843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc875051bc1dfc50841bc7e55c02b0d92fe31059e541830612ce459eb1247d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84531844a5e9861188f762135b344d8f89410bcc2acbf0ec8ac93d188b88bbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjrz8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-09T10:28:41Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.831634 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z74b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01aecf36-9a78-414c-8078-5c114c1dfa3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z74b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:41Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:41 crc 
kubenswrapper[4740]: I1009 10:28:41.845159 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:41Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.855979 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:41Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.874120 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.874156 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.874169 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.874183 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.874194 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:41Z","lastTransitionTime":"2025-10-09T10:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.875684 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:41Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.892861 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:41Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.911332 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:41Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.922143 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d9545b-0075-4442-ab50-88400a66cbc6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99eb305059d073b23f482d05ece1d61192433362fffd0bc220e2d1ddd21c8943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a9cf35703b8479a9f7662d356465d6a50571a64ac5e106ec44c26e3656f815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f693c5e310d6830896a21d092ec855a8f9a5ea16c9fa82d18f9aa2e5fe6e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c020efcdc9c34156c4ce09e7186644e1e1d9a1cd49a67cc294262ccbf68ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://11c020efcdc9c34156c4ce09e7186644e1e1d9a1cd49a67cc294262ccbf68ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:41Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.930972 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12289764
1f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:41Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.942124 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d90
1bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:41Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.960680 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:41Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.976196 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:41 crc 
kubenswrapper[4740]: I1009 10:28:41.976255 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.976267 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.976282 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.976295 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:41Z","lastTransitionTime":"2025-10-09T10:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.981923 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:37Z\\\",\\\"message\\\":\\\"oints{}, externalTrafficLocal:false, internalTrafficLocal:false, 
hasNodePort:false}}\\\\nI1009 10:28:37.598511 6408 services_controller.go:453] Built service openshift-etcd/etcd template LB for network=default: []services.LB{}\\\\nI1009 10:28:37.598527 6408 services_controller.go:444] Built service openshift-ingress-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1009 10:28:37.598535 6408 services_controller.go:454] Service openshift-etcd/etcd for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1009 10:28:37.598558 6408 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1009 10:28:37.598530 6408 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b
2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:41Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:41 crc kubenswrapper[4740]: I1009 10:28:41.995406 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:4
4Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:41Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.008788 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-s
cript\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:42Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.077728 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.077794 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.077812 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.077842 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.077854 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:42Z","lastTransitionTime":"2025-10-09T10:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.181030 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.181097 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.181118 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.181145 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.181166 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:42Z","lastTransitionTime":"2025-10-09T10:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.284055 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.284138 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.284160 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.284193 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.284213 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:42Z","lastTransitionTime":"2025-10-09T10:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.390850 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.390917 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.390933 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.390969 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.390986 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:42Z","lastTransitionTime":"2025-10-09T10:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.493333 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.493390 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.493417 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.493441 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.493457 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:42Z","lastTransitionTime":"2025-10-09T10:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.596451 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.596498 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.596511 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.596527 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.596538 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:42Z","lastTransitionTime":"2025-10-09T10:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.699105 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.699177 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.699202 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.699232 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.699254 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:42Z","lastTransitionTime":"2025-10-09T10:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.752600 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:42 crc kubenswrapper[4740]: E1009 10:28:42.752792 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.802235 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.802280 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.802291 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.802309 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.802323 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:42Z","lastTransitionTime":"2025-10-09T10:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.904309 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.904342 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.904353 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.904367 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:42 crc kubenswrapper[4740]: I1009 10:28:42.904378 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:42Z","lastTransitionTime":"2025-10-09T10:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.007035 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.007089 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.007097 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.007112 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.007121 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:43Z","lastTransitionTime":"2025-10-09T10:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.109036 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.109085 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.109097 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.109114 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.109127 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:43Z","lastTransitionTime":"2025-10-09T10:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.212081 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.212146 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.212169 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.212197 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.212217 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:43Z","lastTransitionTime":"2025-10-09T10:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.314908 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.314976 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.314998 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.315028 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.315054 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:43Z","lastTransitionTime":"2025-10-09T10:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.417319 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.417617 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.417745 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.417934 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.418047 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:43Z","lastTransitionTime":"2025-10-09T10:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.521119 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.521383 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.521465 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.521542 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.521599 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:43Z","lastTransitionTime":"2025-10-09T10:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.624537 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.624586 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.624602 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.624623 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.624640 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:43Z","lastTransitionTime":"2025-10-09T10:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.727617 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.727668 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.727681 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.727699 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.727713 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:43Z","lastTransitionTime":"2025-10-09T10:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.753355 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.753440 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:43 crc kubenswrapper[4740]: E1009 10:28:43.753600 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:43 crc kubenswrapper[4740]: E1009 10:28:43.753729 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.753392 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:43 crc kubenswrapper[4740]: E1009 10:28:43.754265 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.831021 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.831070 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.831081 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.831098 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.831110 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:43Z","lastTransitionTime":"2025-10-09T10:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.933730 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.933786 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.933794 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.933807 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:43 crc kubenswrapper[4740]: I1009 10:28:43.933815 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:43Z","lastTransitionTime":"2025-10-09T10:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.035593 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.035632 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.035644 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.035659 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.035668 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:44Z","lastTransitionTime":"2025-10-09T10:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.143266 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.143298 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.143306 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.143319 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.143329 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:44Z","lastTransitionTime":"2025-10-09T10:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.245426 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.245456 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.245464 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.245476 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.245486 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:44Z","lastTransitionTime":"2025-10-09T10:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.348352 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.348409 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.348440 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.348467 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.348487 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:44Z","lastTransitionTime":"2025-10-09T10:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.451137 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.451185 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.451199 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.451220 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.451235 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:44Z","lastTransitionTime":"2025-10-09T10:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.554206 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.554240 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.554250 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.554266 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.554276 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:44Z","lastTransitionTime":"2025-10-09T10:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.657606 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.657641 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.657652 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.657667 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.657679 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:44Z","lastTransitionTime":"2025-10-09T10:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.753319 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:44 crc kubenswrapper[4740]: E1009 10:28:44.753452 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.760514 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.760538 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.760569 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.760584 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.760594 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:44Z","lastTransitionTime":"2025-10-09T10:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.863447 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.863502 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.863523 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.863551 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.863573 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:44Z","lastTransitionTime":"2025-10-09T10:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.966251 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.966905 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.966947 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.966971 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:44 crc kubenswrapper[4740]: I1009 10:28:44.966987 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:44Z","lastTransitionTime":"2025-10-09T10:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.070264 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.070329 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.070348 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.070370 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.070389 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:45Z","lastTransitionTime":"2025-10-09T10:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.173292 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.173362 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.173392 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.173420 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.173442 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:45Z","lastTransitionTime":"2025-10-09T10:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.275661 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.275706 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.275769 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.275789 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.275802 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:45Z","lastTransitionTime":"2025-10-09T10:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.377628 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.377683 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.377715 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.377744 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.377806 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:45Z","lastTransitionTime":"2025-10-09T10:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.481474 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.481513 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.481525 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.481544 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.481555 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:45Z","lastTransitionTime":"2025-10-09T10:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.584246 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.584294 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.584302 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.584318 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.584328 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:45Z","lastTransitionTime":"2025-10-09T10:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.686744 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.686803 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.686815 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.686830 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.686842 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:45Z","lastTransitionTime":"2025-10-09T10:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.753415 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:45 crc kubenswrapper[4740]: E1009 10:28:45.753570 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.754007 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.754066 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:45 crc kubenswrapper[4740]: E1009 10:28:45.754150 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:45 crc kubenswrapper[4740]: E1009 10:28:45.754232 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.788856 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.788884 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.788891 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.788903 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.788911 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:45Z","lastTransitionTime":"2025-10-09T10:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.891435 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.891486 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.891502 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.891527 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.891544 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:45Z","lastTransitionTime":"2025-10-09T10:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.994392 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.994446 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.994460 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.994480 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:45 crc kubenswrapper[4740]: I1009 10:28:45.994494 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:45Z","lastTransitionTime":"2025-10-09T10:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.097226 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.097270 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.097279 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.097293 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.097303 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:46Z","lastTransitionTime":"2025-10-09T10:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.200710 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.200774 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.200788 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.200802 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.200814 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:46Z","lastTransitionTime":"2025-10-09T10:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.303603 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.303652 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.303663 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.303679 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.303691 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:46Z","lastTransitionTime":"2025-10-09T10:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.405521 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.405562 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.405575 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.405590 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.405600 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:46Z","lastTransitionTime":"2025-10-09T10:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.508164 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.508208 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.508220 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.508235 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.508246 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:46Z","lastTransitionTime":"2025-10-09T10:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.611731 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.611812 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.611826 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.611845 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.611858 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:46Z","lastTransitionTime":"2025-10-09T10:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.714535 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.714639 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.714671 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.714703 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.714726 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:46Z","lastTransitionTime":"2025-10-09T10:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.753258 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:46 crc kubenswrapper[4740]: E1009 10:28:46.753377 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.817545 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.817604 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.817620 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.817642 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.817656 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:46Z","lastTransitionTime":"2025-10-09T10:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.921106 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.921156 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.921171 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.921187 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:46 crc kubenswrapper[4740]: I1009 10:28:46.921199 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:46Z","lastTransitionTime":"2025-10-09T10:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.023819 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.023881 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.023897 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.023922 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.023939 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:47Z","lastTransitionTime":"2025-10-09T10:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.126344 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.126392 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.126403 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.126421 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.126438 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:47Z","lastTransitionTime":"2025-10-09T10:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.229160 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.229228 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.229239 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.229256 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.229271 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:47Z","lastTransitionTime":"2025-10-09T10:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.331359 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.331402 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.331413 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.331428 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.331438 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:47Z","lastTransitionTime":"2025-10-09T10:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.433714 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.433930 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.433939 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.433974 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.433984 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:47Z","lastTransitionTime":"2025-10-09T10:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.536579 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.536620 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.536630 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.536644 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.536657 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:47Z","lastTransitionTime":"2025-10-09T10:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.639293 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.639337 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.639350 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.639369 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.639382 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:47Z","lastTransitionTime":"2025-10-09T10:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.741202 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.741242 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.741252 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.741269 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.741280 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:47Z","lastTransitionTime":"2025-10-09T10:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.752714 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.752800 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:47 crc kubenswrapper[4740]: E1009 10:28:47.752841 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.752920 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:47 crc kubenswrapper[4740]: E1009 10:28:47.752992 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:47 crc kubenswrapper[4740]: E1009 10:28:47.753154 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.831582 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs\") pod \"network-metrics-daemon-z74b9\" (UID: \"01aecf36-9a78-414c-8078-5c114c1dfa3f\") " pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:47 crc kubenswrapper[4740]: E1009 10:28:47.831724 4740 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 10:28:47 crc kubenswrapper[4740]: E1009 10:28:47.831872 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs podName:01aecf36-9a78-414c-8078-5c114c1dfa3f nodeName:}" failed. No retries permitted until 2025-10-09 10:29:19.831847884 +0000 UTC m=+98.794048305 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs") pod "network-metrics-daemon-z74b9" (UID: "01aecf36-9a78-414c-8078-5c114c1dfa3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.844264 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.844302 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.844310 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.844323 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.844333 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:47Z","lastTransitionTime":"2025-10-09T10:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.946944 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.946998 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.947007 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.947023 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:47 crc kubenswrapper[4740]: I1009 10:28:47.947032 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:47Z","lastTransitionTime":"2025-10-09T10:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.049452 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.049492 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.049526 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.049543 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.049554 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:48Z","lastTransitionTime":"2025-10-09T10:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.152101 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.152159 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.152181 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.152208 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.152231 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:48Z","lastTransitionTime":"2025-10-09T10:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.254796 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.254827 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.254837 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.254873 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.254882 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:48Z","lastTransitionTime":"2025-10-09T10:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.357125 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.357161 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.357187 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.357199 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.357208 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:48Z","lastTransitionTime":"2025-10-09T10:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.407951 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.407993 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.408010 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.408026 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.408039 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:48Z","lastTransitionTime":"2025-10-09T10:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:48 crc kubenswrapper[4740]: E1009 10:28:48.429065 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:48Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.432611 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.432648 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.432657 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.432671 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.432681 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:48Z","lastTransitionTime":"2025-10-09T10:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:48 crc kubenswrapper[4740]: E1009 10:28:48.444546 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:48Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.447985 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.448045 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.448063 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.448085 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.448101 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:48Z","lastTransitionTime":"2025-10-09T10:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:48 crc kubenswrapper[4740]: E1009 10:28:48.461060 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:48Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.464101 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.464136 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.464148 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.464164 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.464174 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:48Z","lastTransitionTime":"2025-10-09T10:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:48 crc kubenswrapper[4740]: E1009 10:28:48.474402 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:48Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.477374 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.477427 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.477441 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.477461 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.477472 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:48Z","lastTransitionTime":"2025-10-09T10:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:48 crc kubenswrapper[4740]: E1009 10:28:48.487632 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:48Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:48 crc kubenswrapper[4740]: E1009 10:28:48.487777 4740 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.489336 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.489380 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.489389 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.489404 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.489413 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:48Z","lastTransitionTime":"2025-10-09T10:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.592079 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.592157 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.592167 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.592179 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.592187 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:48Z","lastTransitionTime":"2025-10-09T10:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.694134 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.694170 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.694187 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.694203 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.694214 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:48Z","lastTransitionTime":"2025-10-09T10:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.753431 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:48 crc kubenswrapper[4740]: E1009 10:28:48.753536 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.796129 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.796157 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.796164 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.796175 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.796184 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:48Z","lastTransitionTime":"2025-10-09T10:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.897844 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.897893 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.897907 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.897923 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:48 crc kubenswrapper[4740]: I1009 10:28:48.897934 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:48Z","lastTransitionTime":"2025-10-09T10:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.000159 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.000189 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.000197 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.000210 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.000222 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:49Z","lastTransitionTime":"2025-10-09T10:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.102412 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.102456 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.102470 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.102489 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.102502 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:49Z","lastTransitionTime":"2025-10-09T10:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.205004 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.205034 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.205042 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.205054 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.205063 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:49Z","lastTransitionTime":"2025-10-09T10:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.307522 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.307553 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.307561 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.307573 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.307583 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:49Z","lastTransitionTime":"2025-10-09T10:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.410951 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.411000 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.411009 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.411025 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.411035 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:49Z","lastTransitionTime":"2025-10-09T10:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.515229 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.515285 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.515305 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.515332 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.515352 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:49Z","lastTransitionTime":"2025-10-09T10:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.619064 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.619127 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.619139 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.619156 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.619168 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:49Z","lastTransitionTime":"2025-10-09T10:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.722420 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.722479 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.722498 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.722522 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.722537 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:49Z","lastTransitionTime":"2025-10-09T10:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.753567 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.753613 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.753669 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:49 crc kubenswrapper[4740]: E1009 10:28:49.753821 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:49 crc kubenswrapper[4740]: E1009 10:28:49.753940 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:49 crc kubenswrapper[4740]: E1009 10:28:49.754062 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.825576 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.825631 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.825643 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.825680 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.825692 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:49Z","lastTransitionTime":"2025-10-09T10:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.954546 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.954599 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.954615 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.954637 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:49 crc kubenswrapper[4740]: I1009 10:28:49.954654 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:49Z","lastTransitionTime":"2025-10-09T10:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.057122 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.057145 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.057153 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.057166 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.057175 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:50Z","lastTransitionTime":"2025-10-09T10:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.159144 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.159207 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.159218 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.159233 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.159243 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:50Z","lastTransitionTime":"2025-10-09T10:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.262062 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.262659 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.262716 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.262748 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.262798 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:50Z","lastTransitionTime":"2025-10-09T10:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.366036 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.366077 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.366085 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.366099 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.366109 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:50Z","lastTransitionTime":"2025-10-09T10:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.468977 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.469049 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.469066 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.469134 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.469153 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:50Z","lastTransitionTime":"2025-10-09T10:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.572015 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.572083 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.572104 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.572134 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.572158 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:50Z","lastTransitionTime":"2025-10-09T10:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.674943 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.674989 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.675001 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.675016 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.675027 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:50Z","lastTransitionTime":"2025-10-09T10:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.753183 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:50 crc kubenswrapper[4740]: E1009 10:28:50.753338 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.777810 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.777853 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.777864 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.777880 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.777891 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:50Z","lastTransitionTime":"2025-10-09T10:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.880893 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.880930 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.880938 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.880951 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.880960 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:50Z","lastTransitionTime":"2025-10-09T10:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.982523 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.982567 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.982575 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.982590 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:50 crc kubenswrapper[4740]: I1009 10:28:50.982600 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:50Z","lastTransitionTime":"2025-10-09T10:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.085343 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.085384 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.085395 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.085408 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.085417 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:51Z","lastTransitionTime":"2025-10-09T10:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.162130 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qrhgt_73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c/kube-multus/0.log" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.162174 4740 generic.go:334] "Generic (PLEG): container finished" podID="73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c" containerID="2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5" exitCode=1 Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.162205 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qrhgt" event={"ID":"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c","Type":"ContainerDied","Data":"2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5"} Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.162657 4740 scope.go:117] "RemoveContainer" containerID="2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.174904 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.187958 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.187982 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.187989 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.188001 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.188010 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:51Z","lastTransitionTime":"2025-10-09T10:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.188444 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.198662 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47870d7b-1faf-4429-81f5-3d0c8b489843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc875051bc1dfc50841bc7e55c02b0d92fe31059e541830612ce459eb1247d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84531844a5e9861188f762135b344d8f89410
bcc2acbf0ec8ac93d188b88bbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjrz8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.208007 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z74b9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01aecf36-9a78-414c-8078-5c114c1dfa3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z74b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc 
kubenswrapper[4740]: I1009 10:28:51.220415 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.231308 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d9545b-0075-4442-ab50-88400a66cbc6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99eb305059d073b23f482d05ece1d61192433362fffd0bc220e2d1ddd21c8943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a9cf35703b8479a9f7662d356465d6a50571a64ac5e106ec44c26e3656f815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f693c5e310d6830896a21d092ec855a8f9a5ea16c9fa82d18f9aa2e5fe6e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c020efcdc9c34156c4ce09e7186644e1e1d9a1cd49a67cc294262ccbf68ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://11c020efcdc9c34156c4ce09e7186644e1e1d9a1cd49a67cc294262ccbf68ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.244677 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.257864 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.280800 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:37Z\\\",\\\"message\\\":\\\"oints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1009 10:28:37.598511 6408 services_controller.go:453] Built service openshift-etcd/etcd template LB for network=default: []services.LB{}\\\\nI1009 10:28:37.598527 6408 services_controller.go:444] Built service openshift-ingress-operator/metrics LB per-node 
configs for network=default: []services.lbConfig(nil)\\\\nI1009 10:28:37.598535 6408 services_controller.go:454] Service openshift-etcd/etcd for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1009 10:28:37.598558 6408 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1009 10:28:37.598530 6408 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b
2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.290393 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.290430 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.290443 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.290459 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.290469 4740 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:51Z","lastTransitionTime":"2025-10-09T10:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.294856 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1
162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.305413 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.316577 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.327113 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a
9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.338681 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:50Z\\\",\\\"message\\\":\\\"2025-10-09T10:28:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f3f32501-d7ff-47c5-875c-06bdcddb016f\\\\n2025-10-09T10:28:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f3f32501-d7ff-47c5-875c-06bdcddb016f to /host/opt/cni/bin/\\\\n2025-10-09T10:28:05Z [verbose] multus-daemon started\\\\n2025-10-09T10:28:05Z [verbose] Readiness Indicator file check\\\\n2025-10-09T10:28:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.349080 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.361631 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.378449 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff7a1d6bb326aeda9c95a16b0f56a4096232e1fad83eca05c1a11038b668de8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef56
7b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.392566 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.392598 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.392608 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.392624 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.392635 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:51Z","lastTransitionTime":"2025-10-09T10:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.494965 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.495010 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.495026 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.495047 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.495062 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:51Z","lastTransitionTime":"2025-10-09T10:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.598167 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.598208 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.598219 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.598239 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.598253 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:51Z","lastTransitionTime":"2025-10-09T10:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.700817 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.700856 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.700864 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.700880 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.700893 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:51Z","lastTransitionTime":"2025-10-09T10:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.753002 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.753069 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:51 crc kubenswrapper[4740]: E1009 10:28:51.753200 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:51 crc kubenswrapper[4740]: E1009 10:28:51.753258 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.753281 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:51 crc kubenswrapper[4740]: E1009 10:28:51.753495 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.767137 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cer
ts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":
\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.781118 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d9545b-0075-4442-ab50-88400a66cbc6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99eb305059d073b23f482d05ece1d61192433362fffd0bc220e2d1ddd21c8943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a9cf35703b8479a9f7662d356465d6a50571a64ac5e106ec44c26e3656f815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f693c5e310d6830896a21d092ec855a8f9a5ea16c9fa82d18f9aa2e5fe6e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c020efcdc9c34156c4ce09e7186644e1e1d9a1cd49a67cc294262ccbf68ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://11c020efcdc9c34156c4ce09e7186644e1e1d9a1cd49a67cc294262ccbf68ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.798470 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.806419 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.806459 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.806471 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.806485 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.806497 4740 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:51Z","lastTransitionTime":"2025-10-09T10:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.812854 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.837855 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:37Z\\\",\\\"message\\\":\\\"oints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1009 10:28:37.598511 6408 services_controller.go:453] Built service openshift-etcd/etcd template LB for network=default: []services.LB{}\\\\nI1009 10:28:37.598527 6408 services_controller.go:444] Built service openshift-ingress-operator/metrics LB per-node 
configs for network=default: []services.lbConfig(nil)\\\\nI1009 10:28:37.598535 6408 services_controller.go:454] Service openshift-etcd/etcd for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1009 10:28:37.598558 6408 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1009 10:28:37.598530 6408 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b
2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.850671 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:4
4Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.866588 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-s
cript\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.876118 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.886061 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d90
1bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.898068 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:50Z\\\",\\\"message\\\":\\\"2025-10-09T10:28:05+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f3f32501-d7ff-47c5-875c-06bdcddb016f\\\\n2025-10-09T10:28:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f3f32501-d7ff-47c5-875c-06bdcddb016f to /host/opt/cni/bin/\\\\n2025-10-09T10:28:05Z [verbose] multus-daemon started\\\\n2025-10-09T10:28:05Z [verbose] Readiness Indicator file check\\\\n2025-10-09T10:28:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.909175 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.909206 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.909214 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.909226 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.909236 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:51Z","lastTransitionTime":"2025-10-09T10:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.909724 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.920181 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.931086 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff7a1d6bb326aeda9c95a16b0f56a4096232e1fad83eca05c1a11038b668de8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef56
7b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.946727 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.956778 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.966707 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47870d7b-1faf-4429-81f5-3d0c8b489843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc875051bc1dfc50841bc7e55c02b0d92fe31059e541830612ce459eb1247d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84531844a5e9861188f762135b344d8f89410
bcc2acbf0ec8ac93d188b88bbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjrz8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:51 crc kubenswrapper[4740]: I1009 10:28:51.978185 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z74b9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01aecf36-9a78-414c-8078-5c114c1dfa3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z74b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:51Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:52 crc 
kubenswrapper[4740]: I1009 10:28:52.011916 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.011939 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.011948 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.011959 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.011967 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:52Z","lastTransitionTime":"2025-10-09T10:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.113705 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.114022 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.114178 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.114338 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.114497 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:52Z","lastTransitionTime":"2025-10-09T10:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.174201 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qrhgt_73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c/kube-multus/0.log" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.174261 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qrhgt" event={"ID":"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c","Type":"ContainerStarted","Data":"5ed60b7e9b987350e5bfa5f576c1b11d0e02fa7c1adba23203dbfb327ce4f518"} Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.187619 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:52Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.197900 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47870d7b-1faf-4429-81f5-3d0c8b489843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc875051bc1dfc50841bc7e55c02b0d92fe31059e541830612ce459eb1247d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84531844a5e9861188f762135b344d8f89410
bcc2acbf0ec8ac93d188b88bbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjrz8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:52Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.207097 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z74b9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01aecf36-9a78-414c-8078-5c114c1dfa3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z74b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:52Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:52 crc 
kubenswrapper[4740]: I1009 10:28:52.216712 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.216790 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.216805 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.216845 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.216860 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:52Z","lastTransitionTime":"2025-10-09T10:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.218159 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:52Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.230555 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d9545b-0075-4442-ab50-88400a66cbc6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99eb305059d073b23f482d05ece1d61192433362fffd0bc220e2d1ddd21c8943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a9cf35703b8479a9f7662d356465d6a50571a64ac5e106ec44c26e3656f815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f693c5e310d6830896a21d092ec855a8f9a5ea16c9fa82d18f9aa2e5fe6e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.
126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c020efcdc9c34156c4ce09e7186644e1e1d9a1cd49a67cc294262ccbf68ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11c020efcdc9c34156c4ce09e7186644e1e1d9a1cd49a67cc294262ccbf68ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:52Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.244174 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:52Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.255580 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:52Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.267387 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:52Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.279646 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T10:28:52Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.289219 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:52Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.299340 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a
9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:52Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.310620 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ed60b7e9b987350e5bfa5f576c1b11d0e02fa7c1adba23203dbfb327ce4f518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:50Z\\\",\\\"message\\\":\\\"2025-10-09T10:28:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f3f32501-d7ff-47c5-875c-06bdcddb016f\\\\n2025-10-09T10:28:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f3f32501-d7ff-47c5-875c-06bdcddb016f to /host/opt/cni/bin/\\\\n2025-10-09T10:28:05Z [verbose] multus-daemon started\\\\n2025-10-09T10:28:05Z [verbose] Readiness Indicator file check\\\\n2025-10-09T10:28:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\
"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:52Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.323009 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.323097 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.323120 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.323148 4740 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.323171 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:52Z","lastTransitionTime":"2025-10-09T10:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.331357 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:37Z\\\",\\\"message\\\":\\\"oints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1009 10:28:37.598511 6408 services_controller.go:453] Built service openshift-etcd/etcd template LB for network=default: []services.LB{}\\\\nI1009 10:28:37.598527 6408 services_controller.go:444] Built service openshift-ingress-operator/metrics LB per-node 
configs for network=default: []services.lbConfig(nil)\\\\nI1009 10:28:37.598535 6408 services_controller.go:454] Service openshift-etcd/etcd for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1009 10:28:37.598558 6408 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1009 10:28:37.598530 6408 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b
2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:52Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.343523 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:4
4Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:52Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.354628 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:52Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.366682 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:52Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.380387 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff7a1d6bb326aeda9c95a16b0f56a4096232e1fad83eca05c1a11038b668de8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef56
7b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:52Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.425469 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.425528 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.425548 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.425578 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.425617 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:52Z","lastTransitionTime":"2025-10-09T10:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.534018 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.534065 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.534084 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.534107 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.534123 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:52Z","lastTransitionTime":"2025-10-09T10:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.636939 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.636984 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.636997 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.637014 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.637024 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:52Z","lastTransitionTime":"2025-10-09T10:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.740049 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.740088 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.740096 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.740109 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.740118 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:52Z","lastTransitionTime":"2025-10-09T10:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.753521 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:52 crc kubenswrapper[4740]: E1009 10:28:52.753655 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.842125 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.842154 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.842162 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.842175 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.842183 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:52Z","lastTransitionTime":"2025-10-09T10:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.944014 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.944070 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.944082 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.944098 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:52 crc kubenswrapper[4740]: I1009 10:28:52.944110 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:52Z","lastTransitionTime":"2025-10-09T10:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.045965 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.046049 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.046079 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.046107 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.046126 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:53Z","lastTransitionTime":"2025-10-09T10:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.148700 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.148736 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.148744 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.148778 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.148796 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:53Z","lastTransitionTime":"2025-10-09T10:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.250847 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.250890 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.250903 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.250918 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.250930 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:53Z","lastTransitionTime":"2025-10-09T10:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.353399 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.353437 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.353448 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.353464 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.353475 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:53Z","lastTransitionTime":"2025-10-09T10:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.456195 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.456264 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.456288 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.456317 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.456336 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:53Z","lastTransitionTime":"2025-10-09T10:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.558336 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.558371 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.558383 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.558396 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.558405 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:53Z","lastTransitionTime":"2025-10-09T10:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.660803 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.660854 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.660867 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.660887 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.660904 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:53Z","lastTransitionTime":"2025-10-09T10:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.752951 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:53 crc kubenswrapper[4740]: E1009 10:28:53.753091 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.753176 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.753198 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:53 crc kubenswrapper[4740]: E1009 10:28:53.753634 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:53 crc kubenswrapper[4740]: E1009 10:28:53.753787 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.754098 4740 scope.go:117] "RemoveContainer" containerID="5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd" Oct 09 10:28:53 crc kubenswrapper[4740]: E1009 10:28:53.754304 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.762984 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.763051 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.763068 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.763090 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.763142 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:53Z","lastTransitionTime":"2025-10-09T10:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.865638 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.865674 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.865682 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.865696 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.865706 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:53Z","lastTransitionTime":"2025-10-09T10:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.968062 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.968099 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.968110 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.968131 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:53 crc kubenswrapper[4740]: I1009 10:28:53.968140 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:53Z","lastTransitionTime":"2025-10-09T10:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.070534 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.070590 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.070603 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.070625 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.070640 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:54Z","lastTransitionTime":"2025-10-09T10:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.172568 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.172623 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.172632 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.172647 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.172658 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:54Z","lastTransitionTime":"2025-10-09T10:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.275323 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.275375 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.275388 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.275405 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.275417 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:54Z","lastTransitionTime":"2025-10-09T10:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.378339 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.378396 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.378423 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.378446 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.378463 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:54Z","lastTransitionTime":"2025-10-09T10:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.481650 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.481723 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.481741 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.481798 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.481816 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:54Z","lastTransitionTime":"2025-10-09T10:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.585002 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.585078 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.585096 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.585120 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.585138 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:54Z","lastTransitionTime":"2025-10-09T10:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.688625 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.688803 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.688826 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.688849 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.688867 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:54Z","lastTransitionTime":"2025-10-09T10:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.753050 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:54 crc kubenswrapper[4740]: E1009 10:28:54.753213 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.791100 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.791139 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.791149 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.791161 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.791169 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:54Z","lastTransitionTime":"2025-10-09T10:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.893814 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.893838 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.893846 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.893887 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.893897 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:54Z","lastTransitionTime":"2025-10-09T10:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.996294 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.996335 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.996344 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.996359 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:54 crc kubenswrapper[4740]: I1009 10:28:54.996368 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:54Z","lastTransitionTime":"2025-10-09T10:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.103563 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.103609 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.103622 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.103642 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.103658 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:55Z","lastTransitionTime":"2025-10-09T10:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.206531 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.206571 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.206583 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.206604 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.206617 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:55Z","lastTransitionTime":"2025-10-09T10:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.310048 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.310087 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.310098 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.310143 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.310156 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:55Z","lastTransitionTime":"2025-10-09T10:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.412876 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.412920 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.412931 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.412947 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.412964 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:55Z","lastTransitionTime":"2025-10-09T10:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.514943 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.514990 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.514999 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.515015 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.515026 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:55Z","lastTransitionTime":"2025-10-09T10:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.617896 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.617937 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.617948 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.617963 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.617974 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:55Z","lastTransitionTime":"2025-10-09T10:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.719846 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.719881 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.719892 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.719907 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.719917 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:55Z","lastTransitionTime":"2025-10-09T10:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.754793 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.754892 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:55 crc kubenswrapper[4740]: E1009 10:28:55.755020 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.755098 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:55 crc kubenswrapper[4740]: E1009 10:28:55.755287 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:55 crc kubenswrapper[4740]: E1009 10:28:55.755533 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.822490 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.822565 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.822585 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.822613 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.822632 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:55Z","lastTransitionTime":"2025-10-09T10:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.925923 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.925984 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.926004 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.926028 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:55 crc kubenswrapper[4740]: I1009 10:28:55.926047 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:55Z","lastTransitionTime":"2025-10-09T10:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.028918 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.028950 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.028962 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.028997 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.029009 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:56Z","lastTransitionTime":"2025-10-09T10:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.131906 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.131947 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.131963 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.131986 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.132006 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:56Z","lastTransitionTime":"2025-10-09T10:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.235562 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.235621 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.235640 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.235665 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.235681 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:56Z","lastTransitionTime":"2025-10-09T10:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.338684 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.338739 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.338789 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.338813 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.338860 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:56Z","lastTransitionTime":"2025-10-09T10:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.441861 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.441936 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.441959 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.441994 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.442018 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:56Z","lastTransitionTime":"2025-10-09T10:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.544050 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.544082 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.544092 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.544106 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.544116 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:56Z","lastTransitionTime":"2025-10-09T10:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.646976 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.647033 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.647053 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.647083 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.647105 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:56Z","lastTransitionTime":"2025-10-09T10:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.750447 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.750484 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.750493 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.750505 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.750514 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:56Z","lastTransitionTime":"2025-10-09T10:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.753105 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:56 crc kubenswrapper[4740]: E1009 10:28:56.753340 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.853668 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.853719 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.853736 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.853796 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.853814 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:56Z","lastTransitionTime":"2025-10-09T10:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.955916 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.955976 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.955993 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.956019 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:56 crc kubenswrapper[4740]: I1009 10:28:56.956037 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:56Z","lastTransitionTime":"2025-10-09T10:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.059566 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.059639 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.059656 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.059682 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.059698 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:57Z","lastTransitionTime":"2025-10-09T10:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.162423 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.162485 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.162503 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.162525 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.162543 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:57Z","lastTransitionTime":"2025-10-09T10:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.265049 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.265105 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.265124 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.265149 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.265168 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:57Z","lastTransitionTime":"2025-10-09T10:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.368249 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.368294 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.368305 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.368323 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.368334 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:57Z","lastTransitionTime":"2025-10-09T10:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.471121 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.471201 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.471219 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.471243 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.471261 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:57Z","lastTransitionTime":"2025-10-09T10:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.574378 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.574437 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.574455 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.574477 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.574493 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:57Z","lastTransitionTime":"2025-10-09T10:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.677319 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.677368 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.677379 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.677394 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.677404 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:57Z","lastTransitionTime":"2025-10-09T10:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.753088 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.753115 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.753383 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:57 crc kubenswrapper[4740]: E1009 10:28:57.753307 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:57 crc kubenswrapper[4740]: E1009 10:28:57.753519 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:57 crc kubenswrapper[4740]: E1009 10:28:57.753660 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.781257 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.781306 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.781325 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.781349 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.781366 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:57Z","lastTransitionTime":"2025-10-09T10:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.885234 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.885293 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.885310 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.885328 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.885341 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:57Z","lastTransitionTime":"2025-10-09T10:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.988355 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.988424 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.988438 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.988456 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:57 crc kubenswrapper[4740]: I1009 10:28:57.988468 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:57Z","lastTransitionTime":"2025-10-09T10:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.091670 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.091730 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.091781 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.091805 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.091822 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:58Z","lastTransitionTime":"2025-10-09T10:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.193027 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.193056 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.193064 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.193075 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.193083 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:58Z","lastTransitionTime":"2025-10-09T10:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.296704 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.296810 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.296830 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.296854 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.296872 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:58Z","lastTransitionTime":"2025-10-09T10:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.400135 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.400183 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.400200 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.400221 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.400240 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:58Z","lastTransitionTime":"2025-10-09T10:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.503566 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.503642 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.503667 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.503699 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.503724 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:58Z","lastTransitionTime":"2025-10-09T10:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.607222 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.607278 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.607293 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.607314 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.607333 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:58Z","lastTransitionTime":"2025-10-09T10:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.710880 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.710955 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.710992 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.711025 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.711048 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:58Z","lastTransitionTime":"2025-10-09T10:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.715808 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.715859 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.715882 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.715908 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.715929 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:58Z","lastTransitionTime":"2025-10-09T10:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:58 crc kubenswrapper[4740]: E1009 10:28:58.737610 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:58Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.742849 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.742901 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.742918 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.742941 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.742958 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:58Z","lastTransitionTime":"2025-10-09T10:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.753435 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:28:58 crc kubenswrapper[4740]: E1009 10:28:58.753620 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:28:58 crc kubenswrapper[4740]: E1009 10:28:58.763439 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:58Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.767905 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.768030 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.768057 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.768090 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.768115 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:58Z","lastTransitionTime":"2025-10-09T10:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:58 crc kubenswrapper[4740]: E1009 10:28:58.788049 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:58Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.793349 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.793502 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.793530 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.793560 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.793582 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:58Z","lastTransitionTime":"2025-10-09T10:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:58 crc kubenswrapper[4740]: E1009 10:28:58.813494 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:58Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.818186 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.818229 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.818246 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.818269 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.818286 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:58Z","lastTransitionTime":"2025-10-09T10:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:58 crc kubenswrapper[4740]: E1009 10:28:58.839073 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cc4442-9b49-4c7f-99f3-2bf04675ca56\\\",\\\"systemUUID\\\":\\\"7223a8fe-fe17-4b87-a3ce-38254af72372\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:28:58Z is after 2025-08-24T17:21:41Z" Oct 09 10:28:58 crc kubenswrapper[4740]: E1009 10:28:58.839292 4740 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.841210 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.841289 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.841314 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.841345 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.841368 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:58Z","lastTransitionTime":"2025-10-09T10:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.944393 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.944445 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.944456 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.944474 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:58 crc kubenswrapper[4740]: I1009 10:28:58.944485 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:58Z","lastTransitionTime":"2025-10-09T10:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.047439 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.047487 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.047507 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.047529 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.047545 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:59Z","lastTransitionTime":"2025-10-09T10:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.150571 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.150626 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.150638 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.150655 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.150666 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:59Z","lastTransitionTime":"2025-10-09T10:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.254054 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.254207 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.254234 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.254328 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.254418 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:59Z","lastTransitionTime":"2025-10-09T10:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.358373 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.358458 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.358495 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.358525 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.358545 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:59Z","lastTransitionTime":"2025-10-09T10:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.461372 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.461429 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.461440 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.461455 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.461467 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:59Z","lastTransitionTime":"2025-10-09T10:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.564947 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.565021 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.565039 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.565061 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.565079 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:59Z","lastTransitionTime":"2025-10-09T10:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.667943 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.668014 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.668039 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.668068 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.668090 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:59Z","lastTransitionTime":"2025-10-09T10:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.753055 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.753113 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.753207 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:28:59 crc kubenswrapper[4740]: E1009 10:28:59.753341 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:28:59 crc kubenswrapper[4740]: E1009 10:28:59.753570 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:28:59 crc kubenswrapper[4740]: E1009 10:28:59.753936 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.771060 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.771122 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.771143 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.771198 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.771220 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:59Z","lastTransitionTime":"2025-10-09T10:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.874605 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.874653 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.874664 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.874679 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.874689 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:59Z","lastTransitionTime":"2025-10-09T10:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.976944 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.976972 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.976981 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.976995 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:28:59 crc kubenswrapper[4740]: I1009 10:28:59.977004 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:28:59Z","lastTransitionTime":"2025-10-09T10:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.080606 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.080641 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.080651 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.080665 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.080675 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:00Z","lastTransitionTime":"2025-10-09T10:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.182312 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.182352 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.182361 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.182374 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.182384 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:00Z","lastTransitionTime":"2025-10-09T10:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.284427 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.284474 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.284484 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.284498 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.284507 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:00Z","lastTransitionTime":"2025-10-09T10:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.387073 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.387136 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.387145 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.387159 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.387168 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:00Z","lastTransitionTime":"2025-10-09T10:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.490283 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.490812 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.490878 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.490945 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.491029 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:00Z","lastTransitionTime":"2025-10-09T10:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.593504 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.593554 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.593569 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.593589 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.593605 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:00Z","lastTransitionTime":"2025-10-09T10:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.695491 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.695566 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.695620 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.695641 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.695652 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:00Z","lastTransitionTime":"2025-10-09T10:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.752934 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:29:00 crc kubenswrapper[4740]: E1009 10:29:00.753353 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.765365 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.797871 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.797908 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.797917 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.797929 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.797938 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:00Z","lastTransitionTime":"2025-10-09T10:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.900557 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.900594 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.900607 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.900625 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:00 crc kubenswrapper[4740]: I1009 10:29:00.900638 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:00Z","lastTransitionTime":"2025-10-09T10:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.003091 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.003156 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.003166 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.003181 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.003193 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:01Z","lastTransitionTime":"2025-10-09T10:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.105882 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.106105 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.106197 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.106282 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.106391 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:01Z","lastTransitionTime":"2025-10-09T10:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.209141 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.209684 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.209947 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.210145 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.210339 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:01Z","lastTransitionTime":"2025-10-09T10:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.313651 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.313697 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.313709 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.313728 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.313739 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:01Z","lastTransitionTime":"2025-10-09T10:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.416200 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.416410 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.416468 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.416566 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.416625 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:01Z","lastTransitionTime":"2025-10-09T10:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.518507 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.519049 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.519133 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.519219 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.519314 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:01Z","lastTransitionTime":"2025-10-09T10:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.622111 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.622146 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.622176 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.622197 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.622208 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:01Z","lastTransitionTime":"2025-10-09T10:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.724779 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.724813 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.724822 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.724837 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.724848 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:01Z","lastTransitionTime":"2025-10-09T10:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.752811 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:29:01 crc kubenswrapper[4740]: E1009 10:29:01.752970 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.753023 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:29:01 crc kubenswrapper[4740]: E1009 10:29:01.753114 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.753191 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:29:01 crc kubenswrapper[4740]: E1009 10:29:01.753320 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.766927 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ed60b7e9b987350e5bfa5f576c1b11d0e02fa7c1adba23203dbfb327ce4f518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:50Z\\\",\\\"message\\\":\\\"2025-10-09T10:28:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_f3f32501-d7ff-47c5-875c-06bdcddb016f\\\\n2025-10-09T10:28:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f3f32501-d7ff-47c5-875c-06bdcddb016f to /host/opt/cni/bin/\\\\n2025-10-09T10:28:05Z [verbose] multus-daemon started\\\\n2025-10-09T10:28:05Z [verbose] Readiness Indicator file check\\\\n2025-10-09T10:28:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.799691 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:37Z\\\",\\\"message\\\":\\\"oints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1009 10:28:37.598511 6408 services_controller.go:453] Built service openshift-etcd/etcd template LB for network=default: []services.LB{}\\\\nI1009 10:28:37.598527 6408 services_controller.go:444] Built service openshift-ingress-operator/metrics LB per-node 
configs for network=default: []services.lbConfig(nil)\\\\nI1009 10:28:37.598535 6408 services_controller.go:454] Service openshift-etcd/etcd for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1009 10:28:37.598558 6408 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1009 10:28:37.598530 6408 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b
2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.817768 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:4
4Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.828491 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.828525 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.828546 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.828562 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.828572 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:01Z","lastTransitionTime":"2025-10-09T10:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.833236 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.849891 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.868257 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d90
1bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.885070 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff7a1d6bb326aeda9c95a16b0f56a4096232e1fad83eca05c1a11038b668de8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef56
7b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.895806 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2981e32e-61da-45a1-ac8e-cb5400c80a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c7d87ddbe1b2db67f0c17cedc17e4548dae05e62b6d1d9c2d77794c71439958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e27cfcfe86124c9582532bc3cf2decfc91f0c8335bde7bb17ecb03e1425dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e27cfcfe86124c9582532bc3cf2decfc91f0c8335bde7bb17ecb03e1425dcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.909935 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.925918 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.930908 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.930953 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.930969 4740 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.930989 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.931007 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:01Z","lastTransitionTime":"2025-10-09T10:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.945508 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.962270 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.979800 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47870d7b-1faf-4429-81f5-3d0c8b489843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc875051bc1dfc50841bc7e55c02b0d92fe31059e541830612ce459eb1247d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84531844a5e9861188f762135b344d8f89410
bcc2acbf0ec8ac93d188b88bbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjrz8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:01 crc kubenswrapper[4740]: I1009 10:29:01.993481 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z74b9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01aecf36-9a78-414c-8078-5c114c1dfa3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z74b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:02 crc 
kubenswrapper[4740]: I1009 10:29:02.008026 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca54193358d78e1\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.019569 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d9545b-0075-4442-ab50-88400a66cbc6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99eb305059d073b23f482d05ece1d61192433362fffd0bc220e2d1ddd21c8943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a9cf35703b8479a9f7662d356465d6a50571a64ac5e106ec44c26e3656f815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f693c5e310d6830896a21d092ec855a8f9a5ea16c9fa82d18f9aa2e5fe6e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c020efcdc9c34156c4ce09e7186644e1e1d9a1cd49a67cc294262ccbf68ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://11c020efcdc9c34156c4ce09e7186644e1e1d9a1cd49a67cc294262ccbf68ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.032946 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.032999 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.033030 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.033053 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.033070 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:02Z","lastTransitionTime":"2025-10-09T10:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.033855 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:02Z is after 2025-08-24T17:21:41Z"
Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.044892 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.135990 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.136033 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.136043 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.136057 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.136070 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:02Z","lastTransitionTime":"2025-10-09T10:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.238650 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.238698 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.238709 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.238725 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.238737 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:02Z","lastTransitionTime":"2025-10-09T10:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.340400 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.340471 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.340500 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.340528 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.340544 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:02Z","lastTransitionTime":"2025-10-09T10:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.443782 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.443833 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.443842 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.443855 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.443864 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:02Z","lastTransitionTime":"2025-10-09T10:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.546832 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.546887 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.546898 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.546915 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.546927 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:02Z","lastTransitionTime":"2025-10-09T10:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.649072 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.649137 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.649152 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.649173 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.649188 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:02Z","lastTransitionTime":"2025-10-09T10:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.752468 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.752517 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.752533 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.752594 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.752604 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.752611 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:02Z","lastTransitionTime":"2025-10-09T10:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:29:02 crc kubenswrapper[4740]: E1009 10:29:02.752813 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.855865 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.855897 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.855905 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.855916 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.855926 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:02Z","lastTransitionTime":"2025-10-09T10:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.958482 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.958538 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.958550 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.958569 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:02 crc kubenswrapper[4740]: I1009 10:29:02.958582 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:02Z","lastTransitionTime":"2025-10-09T10:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.061074 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.061110 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.061118 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.061131 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.061140 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:03Z","lastTransitionTime":"2025-10-09T10:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.162978 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.163013 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.163023 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.163039 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.163048 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:03Z","lastTransitionTime":"2025-10-09T10:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.266571 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.266613 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.266622 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.266639 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.266648 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:03Z","lastTransitionTime":"2025-10-09T10:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.369265 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.369343 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.369366 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.369398 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.369420 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:03Z","lastTransitionTime":"2025-10-09T10:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.472664 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.472720 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.472731 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.472778 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.472822 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:03Z","lastTransitionTime":"2025-10-09T10:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.576264 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.576346 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.576375 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.576405 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.576428 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:03Z","lastTransitionTime":"2025-10-09T10:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.679913 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.680021 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.680044 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.680076 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.680098 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:03Z","lastTransitionTime":"2025-10-09T10:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.753520 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.753559 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.753525 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:29:03 crc kubenswrapper[4740]: E1009 10:29:03.753682 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:29:03 crc kubenswrapper[4740]: E1009 10:29:03.753875 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:29:03 crc kubenswrapper[4740]: E1009 10:29:03.753925 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.783482 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.783556 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.783579 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.783606 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.783624 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:03Z","lastTransitionTime":"2025-10-09T10:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.887788 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.887843 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.887851 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.887865 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.887874 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:03Z","lastTransitionTime":"2025-10-09T10:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.991227 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.991282 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.991297 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.991319 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:03 crc kubenswrapper[4740]: I1009 10:29:03.991335 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:03Z","lastTransitionTime":"2025-10-09T10:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.094825 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.094893 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.094931 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.094963 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.094984 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:04Z","lastTransitionTime":"2025-10-09T10:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.198850 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.198916 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.198939 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.198969 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.198992 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:04Z","lastTransitionTime":"2025-10-09T10:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.302113 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.302181 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.302203 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.302225 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.302241 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:04Z","lastTransitionTime":"2025-10-09T10:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.405385 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.405443 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.405461 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.405483 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.405515 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:04Z","lastTransitionTime":"2025-10-09T10:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.507844 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.507899 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.507908 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.507923 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.507933 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:04Z","lastTransitionTime":"2025-10-09T10:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.610663 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.610733 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.610784 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.610810 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.610830 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:04Z","lastTransitionTime":"2025-10-09T10:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.713477 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.713533 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.713548 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.713568 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.713582 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:04Z","lastTransitionTime":"2025-10-09T10:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.752586 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:29:04 crc kubenswrapper[4740]: E1009 10:29:04.752721 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.754050 4740 scope.go:117] "RemoveContainer" containerID="5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.815715 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.815800 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.815819 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.815843 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.815859 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:04Z","lastTransitionTime":"2025-10-09T10:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.917366 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.917399 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.917410 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.917426 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:04 crc kubenswrapper[4740]: I1009 10:29:04.917437 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:04Z","lastTransitionTime":"2025-10-09T10:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.019998 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.020037 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.020045 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.020059 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.020069 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:05Z","lastTransitionTime":"2025-10-09T10:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.123028 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.123096 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.123113 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.123137 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.123154 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:05Z","lastTransitionTime":"2025-10-09T10:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.215285 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-klnl8_192f5d73-ad53-4674-8c35-c72343c6022e/ovnkube-controller/2.log" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.217446 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerStarted","Data":"f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767"} Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.217863 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.225380 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.225424 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.225437 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.225453 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.225464 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:05Z","lastTransitionTime":"2025-10-09T10:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.236599 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59656140-3a06-40cb-a5f1-ea08e22780e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff7a1d6bb326aeda9c95a16b0f56a4096232e1fad83eca05c1a11038b668de8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f423198892028d689f99b5d4f60fe97020f82d46a6e5a511a8c6ce32b13667e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://52655cb5ae48b4ab0c81f33641935ef2fadb84b26c80255d774f0c04fb9cee8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7192f844bc6e5d91f40c2d883a20e939c51e323313a5df97b7572109d4385edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef567b02ff4b935faf0dc98cf65ad6728713c691fe7343b81caa77d84bde800\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db9291795f08219143a7a2378ef9debe1a3a2dc55244280bb270386ee22c4234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2921363c0f68c483ad623d6aa1ee121f08acfebce32bbe23fec20dc38819864d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45vcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mh8cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.251114 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2981e32e-61da-45a1-ac8e-cb5400c80a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c7d87ddbe1b2db67f0c17cedc17e4548dae05e62b6d1d9c2d77794c71439958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e
18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e27cfcfe86124c9582532bc3cf2decfc91f0c8335bde7bb17ecb03e1425dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e27cfcfe86124c9582532bc3cf2decfc91f0c8335bde7bb17ecb03e1425dcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:05 crc 
kubenswrapper[4740]: I1009 10:29:05.271459 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.286705 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.298401 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfe14851e81cfa999d8148242e61a4062b60e34a1758fd61912a475086560a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.308904 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.319238 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47870d7b-1faf-4429-81f5-3d0c8b489843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc875051bc1dfc50841bc7e55c02b0d92fe31059e541830612ce459eb1247d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84531844a5e9861188f762135b344d8f89410
bcc2acbf0ec8ac93d188b88bbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6fq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjrz8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.328024 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.328059 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.328069 4740 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.328083 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.328092 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:05Z","lastTransitionTime":"2025-10-09T10:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.329148 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z74b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01aecf36-9a78-414c-8078-5c114c1dfa3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flwht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z74b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:05Z is after 2025-08-24T17:21:41Z" Oct 
09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.339826 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d41576-1325-4ee6-a500-553f04a49fa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1849768eaaf777d17780620d4ef4efaab7b6a457df9316ff3417ce33dde57ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe4c3b900e2f70a8bebd9211c6b9c7e81f40948afd377b1cca5
4193358d78e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00cb29390bff205d55f60a3ccda5712c467c55c2c9ec66e0a3341b81d1b0fd74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4cdaafc97d55f89749a6e057920a94da81524e742aef086788b106b262257e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed082
87faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.349385 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d9545b-0075-4442-ab50-88400a66cbc6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99eb305059d073b23f482d05ece1d61192433362fffd0bc220e2d1ddd21c8943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a9cf35703b8479a9f7662d356465d6a50571a64ac5e106ec44c26e3656f815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f693c5e310d6830896a21d092ec855a8f9a5ea16c9fa82d18f9aa2e5fe6e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c020efcdc9c34156c4ce09e7186644e1e1d9a1cd49a67cc294262ccbf68ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://11c020efcdc9c34156c4ce09e7186644e1e1d9a1cd49a67cc294262ccbf68ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.359913 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c92428f44f6f688c3a43a2ed574a1146e78e8a3648a6b120173d7185a38454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e2ad5bb87d72824cb71abc7a2e8c221c4b6b0428ce51d60367e0e173d0c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.368525 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4b8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"259e1f79-cddc-4d7a-9f18-ead71047d789\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0747d8dbb332e37834c711645b577e7a2e54cc13b62db6dc9eaf0089faf6ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lprx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4b8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.380375 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qrhgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ed60b7e9b987350e5bfa5f576c1b11d0e02fa7c1adba23203dbfb327ce4f518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:50Z\\\",\\\"message\\\":\\\"2025-10-09T10:28:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f3f32501-d7ff-47c5-875c-06bdcddb016f\\\\n2025-10-09T10:28:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f3f32501-d7ff-47c5-875c-06bdcddb016f to /host/opt/cni/bin/\\\\n2025-10-09T10:28:05Z [verbose] multus-daemon started\\\\n2025-10-09T10:28:05Z [verbose] Readiness Indicator file check\\\\n2025-10-09T10:28:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvssn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qrhgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.396586 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192f5d73-ad53-4674-8c35-c72343c6022e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T10:28:37Z\\\",\\\"message\\\":\\\"oints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1009 10:28:37.598511 6408 services_controller.go:453] Built service openshift-etcd/etcd template LB for network=default: []services.LB{}\\\\nI1009 10:28:37.598527 6408 services_controller.go:444] Built service openshift-ingress-operator/metrics LB per-node 
configs for network=default: []services.lbConfig(nil)\\\\nI1009 10:28:37.598535 6408 services_controller.go:454] Service openshift-etcd/etcd for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1009 10:28:37.598558 6408 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1009 10:28:37.598530 6408 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: 
f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gsjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-klnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.409744 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bfb72e-cc4f-451f-a56a-8c1e7eddfdf4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:27:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d5a63f0734339aa780a85a33b4788778ce85a11f95ab0f29574694f6653ac2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab147fb73e7010bae5ab2f70cbe97082d4a1c167df89fe575b1640b5c75ba75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34c3f056bc8166221ac80d31543c8b4eec3362725ace538faaa16d82c7c4f6c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1162ab8a5cb935b2f4e47a3caf1010716d947ce6c6eeaac829751111518efdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:4
4Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505da787e5f031a7e93cce79bf5c52b0971634c70f0b022973493e6ae53bc7e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:27:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c512ef6701a70a1a3edcfd62c5681a585c78cf4c96798d8884310eefe94bbcf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T10:27:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T10:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-10-09T10:27:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.419956 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a1a5599847ba9dda13c1b30a5b25f038a266bc88ac6d3458c1f06cf6c6a2f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-s
cript\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.428811 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lw8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a4a628b-ac64-4290-b415-92d89a9e7b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122897641f0297808a026c9ed182996f375cccc1216021a98377f3be6d7283ca\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwxn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lw8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.430122 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.430167 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.430181 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.430199 4740 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.430210 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:05Z","lastTransitionTime":"2025-10-09T10:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.438352 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223b849a-db98-4f56-a649-9e144189950a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T10:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://659aeec0f4002ee42961282396cc37a9454e41b52aae0559cb48516221910e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T10:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zsrz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T10:28:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kdjch\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T10:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.532547 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.532580 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.532590 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.532603 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.532611 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:05Z","lastTransitionTime":"2025-10-09T10:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.621320 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.621429 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.621464 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.621506 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.621533 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:29:05 crc kubenswrapper[4740]: E1009 10:29:05.621573 4740 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 10:29:05 crc kubenswrapper[4740]: E1009 10:29:05.621656 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 10:29:05 crc kubenswrapper[4740]: E1009 10:29:05.621665 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.62163417 +0000 UTC m=+148.583834591 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 10:29:05 crc kubenswrapper[4740]: E1009 10:29:05.621672 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 10:29:05 crc kubenswrapper[4740]: E1009 10:29:05.621694 4740 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 10:29:05 crc kubenswrapper[4740]: E1009 10:29:05.621732 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 10:29:05 crc kubenswrapper[4740]: E1009 10:29:05.621780 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 10:29:05 crc kubenswrapper[4740]: E1009 10:29:05.621797 4740 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:29:05 crc kubenswrapper[4740]: E1009 10:29:05.621714 4740 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:29:05 crc kubenswrapper[4740]: E1009 10:29:05.621833 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.621802514 +0000 UTC m=+148.584002975 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 10:29:05 crc kubenswrapper[4740]: E1009 10:29:05.621926 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.621907547 +0000 UTC m=+148.584107958 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:29:05 crc kubenswrapper[4740]: E1009 10:29:05.621952 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.621939658 +0000 UTC m=+148.584140179 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 10:29:05 crc kubenswrapper[4740]: E1009 10:29:05.622044 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.62203084 +0000 UTC m=+148.584231251 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.634975 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.635029 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.635065 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.635099 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 
10:29:05.635121 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:05Z","lastTransitionTime":"2025-10-09T10:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.737663 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.737700 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.737715 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.737729 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.737738 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:05Z","lastTransitionTime":"2025-10-09T10:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.753500 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.753505 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.753729 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:29:05 crc kubenswrapper[4740]: E1009 10:29:05.753820 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:29:05 crc kubenswrapper[4740]: E1009 10:29:05.753869 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:29:05 crc kubenswrapper[4740]: E1009 10:29:05.754042 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.840446 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.840519 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.840542 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.840571 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.840598 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:05Z","lastTransitionTime":"2025-10-09T10:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.943940 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.943986 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.943997 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.944013 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:05 crc kubenswrapper[4740]: I1009 10:29:05.944024 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:05Z","lastTransitionTime":"2025-10-09T10:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.046443 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.046777 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.046788 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.046801 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.046811 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:06Z","lastTransitionTime":"2025-10-09T10:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.149334 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.149386 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.149409 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.149438 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.149460 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:06Z","lastTransitionTime":"2025-10-09T10:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.221991 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-klnl8_192f5d73-ad53-4674-8c35-c72343c6022e/ovnkube-controller/3.log" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.222640 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-klnl8_192f5d73-ad53-4674-8c35-c72343c6022e/ovnkube-controller/2.log" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.227733 4740 generic.go:334] "Generic (PLEG): container finished" podID="192f5d73-ad53-4674-8c35-c72343c6022e" containerID="f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767" exitCode=1 Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.227792 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerDied","Data":"f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767"} Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.227827 4740 scope.go:117] "RemoveContainer" containerID="5c123a636b820a073699dd0a8b045abc795b56138706ca965ec81b57639260dd" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.228746 4740 scope.go:117] "RemoveContainer" containerID="f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767" Oct 09 10:29:06 crc kubenswrapper[4740]: E1009 10:29:06.231690 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.251711 4740 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.251738 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.251776 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.251795 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.251810 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:06Z","lastTransitionTime":"2025-10-09T10:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.291795 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mh8cv" podStartSLOduration=65.291778849 podStartE2EDuration="1m5.291778849s" podCreationTimestamp="2025-10-09 10:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:29:06.291476411 +0000 UTC m=+85.253676792" watchObservedRunningTime="2025-10-09 10:29:06.291778849 +0000 UTC m=+85.253979230" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.300666 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=6.300650554 podStartE2EDuration="6.300650554s" podCreationTimestamp="2025-10-09 10:29:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:29:06.300196092 +0000 UTC m=+85.262396473" watchObservedRunningTime="2025-10-09 10:29:06.300650554 +0000 UTC m=+85.262850935" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.321974 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjrz8" podStartSLOduration=64.321958609 podStartE2EDuration="1m4.321958609s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:29:06.31068203 +0000 UTC m=+85.272882411" watchObservedRunningTime="2025-10-09 10:29:06.321958609 +0000 UTC m=+85.284158990" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.353407 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:06 crc 
kubenswrapper[4740]: I1009 10:29:06.353450 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.353462 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.353478 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.353492 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:06Z","lastTransitionTime":"2025-10-09T10:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.386197 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4b8lj" podStartSLOduration=65.386179121 podStartE2EDuration="1m5.386179121s" podCreationTimestamp="2025-10-09 10:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:29:06.385846472 +0000 UTC m=+85.348046853" watchObservedRunningTime="2025-10-09 10:29:06.386179121 +0000 UTC m=+85.348379502" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.413205 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=30.413183016 podStartE2EDuration="30.413183016s" podCreationTimestamp="2025-10-09 10:28:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-09 10:29:06.413061633 +0000 UTC m=+85.375262044" watchObservedRunningTime="2025-10-09 10:29:06.413183016 +0000 UTC m=+85.375383397" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.413600 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=62.413585107 podStartE2EDuration="1m2.413585107s" podCreationTimestamp="2025-10-09 10:28:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:29:06.400949462 +0000 UTC m=+85.363149853" watchObservedRunningTime="2025-10-09 10:29:06.413585107 +0000 UTC m=+85.375785488" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.436964 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lw8ns" podStartSLOduration=65.436942496 podStartE2EDuration="1m5.436942496s" podCreationTimestamp="2025-10-09 10:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:29:06.423619153 +0000 UTC m=+85.385819554" watchObservedRunningTime="2025-10-09 10:29:06.436942496 +0000 UTC m=+85.399142897" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.437830 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podStartSLOduration=65.437817959 podStartE2EDuration="1m5.437817959s" podCreationTimestamp="2025-10-09 10:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:29:06.436154625 +0000 UTC m=+85.398355016" watchObservedRunningTime="2025-10-09 10:29:06.437817959 +0000 UTC m=+85.400018360" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.456199 4740 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.456421 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.456518 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.456629 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.456715 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:06Z","lastTransitionTime":"2025-10-09T10:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.475889 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qrhgt" podStartSLOduration=65.475871138 podStartE2EDuration="1m5.475871138s" podCreationTimestamp="2025-10-09 10:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:29:06.450907476 +0000 UTC m=+85.413107877" watchObservedRunningTime="2025-10-09 10:29:06.475871138 +0000 UTC m=+85.438071529" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.493075 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=66.493055813 podStartE2EDuration="1m6.493055813s" podCreationTimestamp="2025-10-09 10:28:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:29:06.492281333 +0000 UTC m=+85.454481724" watchObservedRunningTime="2025-10-09 10:29:06.493055813 +0000 UTC m=+85.455256194" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.559866 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.559911 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.559920 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.559935 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.559946 4740 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:06Z","lastTransitionTime":"2025-10-09T10:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.662270 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.662322 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.662337 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.662357 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.662372 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:06Z","lastTransitionTime":"2025-10-09T10:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.752710 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:29:06 crc kubenswrapper[4740]: E1009 10:29:06.752916 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.764786 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.764840 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.764857 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.764879 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.764894 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:06Z","lastTransitionTime":"2025-10-09T10:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.868015 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.868067 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.868088 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.868115 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.868135 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:06Z","lastTransitionTime":"2025-10-09T10:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.971787 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.971845 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.971859 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.971882 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:06 crc kubenswrapper[4740]: I1009 10:29:06.971898 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:06Z","lastTransitionTime":"2025-10-09T10:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.076215 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.076311 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.076333 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.076363 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.076382 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:07Z","lastTransitionTime":"2025-10-09T10:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.180326 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.180379 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.180390 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.180407 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.180419 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:07Z","lastTransitionTime":"2025-10-09T10:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.232909 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-klnl8_192f5d73-ad53-4674-8c35-c72343c6022e/ovnkube-controller/3.log" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.237468 4740 scope.go:117] "RemoveContainer" containerID="f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767" Oct 09 10:29:07 crc kubenswrapper[4740]: E1009 10:29:07.237698 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.282658 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.282697 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.282708 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.282724 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.282733 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:07Z","lastTransitionTime":"2025-10-09T10:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.385579 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.385628 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.385640 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.385657 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.385669 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:07Z","lastTransitionTime":"2025-10-09T10:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.495702 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.495744 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.495771 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.495787 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.495797 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:07Z","lastTransitionTime":"2025-10-09T10:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.598689 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.598741 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.598779 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.598796 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.598809 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:07Z","lastTransitionTime":"2025-10-09T10:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.702424 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.702462 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.702472 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.702488 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.702499 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:07Z","lastTransitionTime":"2025-10-09T10:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.752699 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.752703 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.752842 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:29:07 crc kubenswrapper[4740]: E1009 10:29:07.753169 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:29:07 crc kubenswrapper[4740]: E1009 10:29:07.753330 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:29:07 crc kubenswrapper[4740]: E1009 10:29:07.753509 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.806026 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.806153 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.806206 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.806262 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.806305 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:07Z","lastTransitionTime":"2025-10-09T10:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.912568 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.912610 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.912623 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.912640 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:07 crc kubenswrapper[4740]: I1009 10:29:07.912652 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:07Z","lastTransitionTime":"2025-10-09T10:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.015078 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.015116 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.015127 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.015142 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.015154 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:08Z","lastTransitionTime":"2025-10-09T10:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.117280 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.117316 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.117327 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.117343 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.117354 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:08Z","lastTransitionTime":"2025-10-09T10:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.220020 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.220078 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.220096 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.220116 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.221923 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:08Z","lastTransitionTime":"2025-10-09T10:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.325056 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.325130 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.325152 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.325181 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.325205 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:08Z","lastTransitionTime":"2025-10-09T10:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.428139 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.428188 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.428202 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.428223 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.428235 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:08Z","lastTransitionTime":"2025-10-09T10:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.530971 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.531030 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.531043 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.531061 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.531074 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:08Z","lastTransitionTime":"2025-10-09T10:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.633743 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.633859 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.633883 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.633917 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.633937 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:08Z","lastTransitionTime":"2025-10-09T10:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.737042 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.737093 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.737110 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.737138 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.737157 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:08Z","lastTransitionTime":"2025-10-09T10:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.753229 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:29:08 crc kubenswrapper[4740]: E1009 10:29:08.753385 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.839702 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.839737 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.839747 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.839777 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.839786 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:08Z","lastTransitionTime":"2025-10-09T10:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.937506 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.937544 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.937553 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.937567 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.937577 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T10:29:08Z","lastTransitionTime":"2025-10-09T10:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.980485 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-ldkf8"] Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.980820 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ldkf8" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.983197 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.983342 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.983443 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 09 10:29:08 crc kubenswrapper[4740]: I1009 10:29:08.983510 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 09 10:29:09 crc kubenswrapper[4740]: I1009 10:29:09.155252 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8fabc502-7c74-4840-a0d1-9e2ca1a98d40-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ldkf8\" (UID: \"8fabc502-7c74-4840-a0d1-9e2ca1a98d40\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ldkf8" Oct 09 10:29:09 crc kubenswrapper[4740]: I1009 10:29:09.155315 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8fabc502-7c74-4840-a0d1-9e2ca1a98d40-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ldkf8\" (UID: \"8fabc502-7c74-4840-a0d1-9e2ca1a98d40\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ldkf8" Oct 09 10:29:09 crc kubenswrapper[4740]: I1009 10:29:09.155350 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/8fabc502-7c74-4840-a0d1-9e2ca1a98d40-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ldkf8\" (UID: \"8fabc502-7c74-4840-a0d1-9e2ca1a98d40\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ldkf8" Oct 09 10:29:09 crc kubenswrapper[4740]: I1009 10:29:09.155375 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fabc502-7c74-4840-a0d1-9e2ca1a98d40-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ldkf8\" (UID: \"8fabc502-7c74-4840-a0d1-9e2ca1a98d40\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ldkf8" Oct 09 10:29:09 crc kubenswrapper[4740]: I1009 10:29:09.155418 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8fabc502-7c74-4840-a0d1-9e2ca1a98d40-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ldkf8\" (UID: \"8fabc502-7c74-4840-a0d1-9e2ca1a98d40\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ldkf8" Oct 09 10:29:09 crc kubenswrapper[4740]: I1009 10:29:09.256248 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8fabc502-7c74-4840-a0d1-9e2ca1a98d40-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ldkf8\" (UID: \"8fabc502-7c74-4840-a0d1-9e2ca1a98d40\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ldkf8" Oct 09 10:29:09 crc kubenswrapper[4740]: I1009 10:29:09.256319 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8fabc502-7c74-4840-a0d1-9e2ca1a98d40-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ldkf8\" (UID: \"8fabc502-7c74-4840-a0d1-9e2ca1a98d40\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ldkf8" 
Oct 09 10:29:09 crc kubenswrapper[4740]: I1009 10:29:09.256357 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8fabc502-7c74-4840-a0d1-9e2ca1a98d40-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ldkf8\" (UID: \"8fabc502-7c74-4840-a0d1-9e2ca1a98d40\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ldkf8" Oct 09 10:29:09 crc kubenswrapper[4740]: I1009 10:29:09.256379 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8fabc502-7c74-4840-a0d1-9e2ca1a98d40-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ldkf8\" (UID: \"8fabc502-7c74-4840-a0d1-9e2ca1a98d40\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ldkf8" Oct 09 10:29:09 crc kubenswrapper[4740]: I1009 10:29:09.256403 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fabc502-7c74-4840-a0d1-9e2ca1a98d40-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ldkf8\" (UID: \"8fabc502-7c74-4840-a0d1-9e2ca1a98d40\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ldkf8" Oct 09 10:29:09 crc kubenswrapper[4740]: I1009 10:29:09.256920 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8fabc502-7c74-4840-a0d1-9e2ca1a98d40-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ldkf8\" (UID: \"8fabc502-7c74-4840-a0d1-9e2ca1a98d40\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ldkf8" Oct 09 10:29:09 crc kubenswrapper[4740]: I1009 10:29:09.257006 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8fabc502-7c74-4840-a0d1-9e2ca1a98d40-etc-ssl-certs\") pod 
\"cluster-version-operator-5c965bbfc6-ldkf8\" (UID: \"8fabc502-7c74-4840-a0d1-9e2ca1a98d40\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ldkf8" Oct 09 10:29:09 crc kubenswrapper[4740]: I1009 10:29:09.257449 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8fabc502-7c74-4840-a0d1-9e2ca1a98d40-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ldkf8\" (UID: \"8fabc502-7c74-4840-a0d1-9e2ca1a98d40\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ldkf8" Oct 09 10:29:09 crc kubenswrapper[4740]: I1009 10:29:09.267785 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fabc502-7c74-4840-a0d1-9e2ca1a98d40-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ldkf8\" (UID: \"8fabc502-7c74-4840-a0d1-9e2ca1a98d40\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ldkf8" Oct 09 10:29:09 crc kubenswrapper[4740]: I1009 10:29:09.277919 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8fabc502-7c74-4840-a0d1-9e2ca1a98d40-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ldkf8\" (UID: \"8fabc502-7c74-4840-a0d1-9e2ca1a98d40\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ldkf8" Oct 09 10:29:09 crc kubenswrapper[4740]: I1009 10:29:09.291662 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ldkf8" Oct 09 10:29:09 crc kubenswrapper[4740]: I1009 10:29:09.753250 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:29:09 crc kubenswrapper[4740]: I1009 10:29:09.753250 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:29:09 crc kubenswrapper[4740]: I1009 10:29:09.753383 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:29:09 crc kubenswrapper[4740]: E1009 10:29:09.753521 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:29:09 crc kubenswrapper[4740]: E1009 10:29:09.753673 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:29:09 crc kubenswrapper[4740]: E1009 10:29:09.753828 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:29:10 crc kubenswrapper[4740]: I1009 10:29:10.247781 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ldkf8" event={"ID":"8fabc502-7c74-4840-a0d1-9e2ca1a98d40","Type":"ContainerStarted","Data":"762d4b175fc286e19701d1e88f532947075297ff6040d32a68ef5959828f0dc9"} Oct 09 10:29:10 crc kubenswrapper[4740]: I1009 10:29:10.248216 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ldkf8" event={"ID":"8fabc502-7c74-4840-a0d1-9e2ca1a98d40","Type":"ContainerStarted","Data":"ea6623fb0eb4f9e499271af717f8a6353b9e0cf777a6077433914fdffef91212"} Oct 09 10:29:10 crc kubenswrapper[4740]: I1009 10:29:10.753394 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:29:10 crc kubenswrapper[4740]: E1009 10:29:10.754310 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:29:11 crc kubenswrapper[4740]: I1009 10:29:11.753516 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:29:11 crc kubenswrapper[4740]: E1009 10:29:11.754921 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:29:11 crc kubenswrapper[4740]: I1009 10:29:11.754980 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:29:11 crc kubenswrapper[4740]: I1009 10:29:11.754996 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:29:11 crc kubenswrapper[4740]: E1009 10:29:11.755168 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:29:11 crc kubenswrapper[4740]: E1009 10:29:11.755307 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:29:12 crc kubenswrapper[4740]: I1009 10:29:12.752587 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:29:12 crc kubenswrapper[4740]: E1009 10:29:12.752718 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:29:13 crc kubenswrapper[4740]: I1009 10:29:13.753196 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:29:13 crc kubenswrapper[4740]: E1009 10:29:13.753646 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:29:13 crc kubenswrapper[4740]: I1009 10:29:13.753889 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:29:13 crc kubenswrapper[4740]: E1009 10:29:13.753955 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:29:13 crc kubenswrapper[4740]: I1009 10:29:13.754164 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:29:13 crc kubenswrapper[4740]: E1009 10:29:13.754241 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:29:14 crc kubenswrapper[4740]: I1009 10:29:14.753217 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:29:14 crc kubenswrapper[4740]: E1009 10:29:14.753570 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:29:15 crc kubenswrapper[4740]: I1009 10:29:15.753085 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:29:15 crc kubenswrapper[4740]: I1009 10:29:15.753084 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:29:15 crc kubenswrapper[4740]: I1009 10:29:15.753210 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:29:15 crc kubenswrapper[4740]: E1009 10:29:15.753310 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:29:15 crc kubenswrapper[4740]: E1009 10:29:15.753402 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:29:15 crc kubenswrapper[4740]: E1009 10:29:15.753563 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:29:16 crc kubenswrapper[4740]: I1009 10:29:16.753229 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:29:16 crc kubenswrapper[4740]: E1009 10:29:16.753400 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:29:17 crc kubenswrapper[4740]: I1009 10:29:17.752864 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:29:17 crc kubenswrapper[4740]: I1009 10:29:17.752965 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:29:17 crc kubenswrapper[4740]: E1009 10:29:17.752982 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:29:17 crc kubenswrapper[4740]: I1009 10:29:17.753063 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:29:17 crc kubenswrapper[4740]: E1009 10:29:17.753212 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:29:17 crc kubenswrapper[4740]: E1009 10:29:17.753309 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:29:18 crc kubenswrapper[4740]: I1009 10:29:18.753482 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:29:18 crc kubenswrapper[4740]: E1009 10:29:18.753632 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:29:19 crc kubenswrapper[4740]: I1009 10:29:19.753347 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:29:19 crc kubenswrapper[4740]: E1009 10:29:19.753555 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:29:19 crc kubenswrapper[4740]: I1009 10:29:19.753386 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:29:19 crc kubenswrapper[4740]: E1009 10:29:19.753661 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:29:19 crc kubenswrapper[4740]: I1009 10:29:19.753359 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:29:19 crc kubenswrapper[4740]: E1009 10:29:19.753729 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:29:19 crc kubenswrapper[4740]: I1009 10:29:19.869105 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs\") pod \"network-metrics-daemon-z74b9\" (UID: \"01aecf36-9a78-414c-8078-5c114c1dfa3f\") " pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:29:19 crc kubenswrapper[4740]: E1009 10:29:19.869342 4740 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 10:29:19 crc kubenswrapper[4740]: E1009 10:29:19.869424 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs podName:01aecf36-9a78-414c-8078-5c114c1dfa3f nodeName:}" failed. No retries permitted until 2025-10-09 10:30:23.869402351 +0000 UTC m=+162.831602762 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs") pod "network-metrics-daemon-z74b9" (UID: "01aecf36-9a78-414c-8078-5c114c1dfa3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 10:29:20 crc kubenswrapper[4740]: I1009 10:29:20.753660 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:29:20 crc kubenswrapper[4740]: E1009 10:29:20.753876 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:29:21 crc kubenswrapper[4740]: I1009 10:29:21.752663 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:29:21 crc kubenswrapper[4740]: E1009 10:29:21.753687 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:29:21 crc kubenswrapper[4740]: I1009 10:29:21.753744 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:29:21 crc kubenswrapper[4740]: E1009 10:29:21.754064 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:29:21 crc kubenswrapper[4740]: I1009 10:29:21.754112 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:29:21 crc kubenswrapper[4740]: E1009 10:29:21.754199 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:29:21 crc kubenswrapper[4740]: I1009 10:29:21.754995 4740 scope.go:117] "RemoveContainer" containerID="f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767" Oct 09 10:29:21 crc kubenswrapper[4740]: E1009 10:29:21.755176 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" Oct 09 10:29:22 crc kubenswrapper[4740]: I1009 10:29:22.753102 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:29:22 crc kubenswrapper[4740]: E1009 10:29:22.753283 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:29:23 crc kubenswrapper[4740]: I1009 10:29:23.752851 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:29:23 crc kubenswrapper[4740]: I1009 10:29:23.752926 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:29:23 crc kubenswrapper[4740]: I1009 10:29:23.752859 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:29:23 crc kubenswrapper[4740]: E1009 10:29:23.753028 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:29:23 crc kubenswrapper[4740]: E1009 10:29:23.753194 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 09 10:29:23 crc kubenswrapper[4740]: E1009 10:29:23.753363 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f"
Oct 09 10:29:24 crc kubenswrapper[4740]: I1009 10:29:24.753308 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 09 10:29:24 crc kubenswrapper[4740]: E1009 10:29:24.753561 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 09 10:29:25 crc kubenswrapper[4740]: I1009 10:29:25.752718 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9"
Oct 09 10:29:25 crc kubenswrapper[4740]: I1009 10:29:25.752858 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 09 10:29:25 crc kubenswrapper[4740]: E1009 10:29:25.753000 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f"
Oct 09 10:29:25 crc kubenswrapper[4740]: E1009 10:29:25.753117 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 09 10:29:25 crc kubenswrapper[4740]: I1009 10:29:25.753177 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 09 10:29:25 crc kubenswrapper[4740]: E1009 10:29:25.753236 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 09 10:29:26 crc kubenswrapper[4740]: I1009 10:29:26.753286 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 09 10:29:26 crc kubenswrapper[4740]: E1009 10:29:26.753489 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 09 10:29:26 crc kubenswrapper[4740]: I1009 10:29:26.771311 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ldkf8" podStartSLOduration=85.77129086 podStartE2EDuration="1m25.77129086s" podCreationTimestamp="2025-10-09 10:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:29:10.268577468 +0000 UTC m=+89.230777879" watchObservedRunningTime="2025-10-09 10:29:26.77129086 +0000 UTC m=+105.733491251"
Oct 09 10:29:26 crc kubenswrapper[4740]: I1009 10:29:26.772120 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"]
Oct 09 10:29:27 crc kubenswrapper[4740]: I1009 10:29:27.752882 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 09 10:29:27 crc kubenswrapper[4740]: I1009 10:29:27.752969 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 09 10:29:27 crc kubenswrapper[4740]: I1009 10:29:27.753116 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9"
Oct 09 10:29:27 crc kubenswrapper[4740]: E1009 10:29:27.753225 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 09 10:29:27 crc kubenswrapper[4740]: E1009 10:29:27.753482 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f"
Oct 09 10:29:27 crc kubenswrapper[4740]: E1009 10:29:27.753692 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 09 10:29:28 crc kubenswrapper[4740]: I1009 10:29:28.753574 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 09 10:29:28 crc kubenswrapper[4740]: E1009 10:29:28.753813 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 09 10:29:29 crc kubenswrapper[4740]: I1009 10:29:29.752849 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 09 10:29:29 crc kubenswrapper[4740]: I1009 10:29:29.752941 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 09 10:29:29 crc kubenswrapper[4740]: I1009 10:29:29.753018 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9"
Oct 09 10:29:29 crc kubenswrapper[4740]: E1009 10:29:29.753013 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 09 10:29:29 crc kubenswrapper[4740]: E1009 10:29:29.753128 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 09 10:29:29 crc kubenswrapper[4740]: E1009 10:29:29.753232 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f"
Oct 09 10:29:30 crc kubenswrapper[4740]: I1009 10:29:30.753338 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 09 10:29:30 crc kubenswrapper[4740]: E1009 10:29:30.753462 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 09 10:29:31 crc kubenswrapper[4740]: I1009 10:29:31.753938 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 09 10:29:31 crc kubenswrapper[4740]: I1009 10:29:31.754086 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 09 10:29:31 crc kubenswrapper[4740]: I1009 10:29:31.753994 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9"
Oct 09 10:29:31 crc kubenswrapper[4740]: E1009 10:29:31.757341 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 09 10:29:31 crc kubenswrapper[4740]: E1009 10:29:31.757716 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 09 10:29:31 crc kubenswrapper[4740]: E1009 10:29:31.758041 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f"
Oct 09 10:29:31 crc kubenswrapper[4740]: I1009 10:29:31.806289 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=5.806266993 podStartE2EDuration="5.806266993s" podCreationTimestamp="2025-10-09 10:29:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:29:31.804874286 +0000 UTC m=+110.767074697" watchObservedRunningTime="2025-10-09 10:29:31.806266993 +0000 UTC m=+110.768467404"
Oct 09 10:29:32 crc kubenswrapper[4740]: I1009 10:29:32.752722 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 09 10:29:32 crc kubenswrapper[4740]: E1009 10:29:32.752884 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 09 10:29:33 crc kubenswrapper[4740]: I1009 10:29:33.752674 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 09 10:29:33 crc kubenswrapper[4740]: I1009 10:29:33.752742 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9"
Oct 09 10:29:33 crc kubenswrapper[4740]: I1009 10:29:33.752686 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 09 10:29:33 crc kubenswrapper[4740]: E1009 10:29:33.752944 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f"
Oct 09 10:29:33 crc kubenswrapper[4740]: E1009 10:29:33.753229 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 09 10:29:33 crc kubenswrapper[4740]: E1009 10:29:33.753381 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 09 10:29:34 crc kubenswrapper[4740]: I1009 10:29:34.753615 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 09 10:29:34 crc kubenswrapper[4740]: E1009 10:29:34.755158 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 09 10:29:35 crc kubenswrapper[4740]: I1009 10:29:35.753206 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9"
Oct 09 10:29:35 crc kubenswrapper[4740]: I1009 10:29:35.753272 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 09 10:29:35 crc kubenswrapper[4740]: I1009 10:29:35.753218 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 09 10:29:35 crc kubenswrapper[4740]: E1009 10:29:35.753451 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f"
Oct 09 10:29:35 crc kubenswrapper[4740]: E1009 10:29:35.753583 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 09 10:29:35 crc kubenswrapper[4740]: E1009 10:29:35.753727 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 09 10:29:36 crc kubenswrapper[4740]: I1009 10:29:36.753725 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 09 10:29:36 crc kubenswrapper[4740]: E1009 10:29:36.755616 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 09 10:29:36 crc kubenswrapper[4740]: I1009 10:29:36.756618 4740 scope.go:117] "RemoveContainer" containerID="f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767"
Oct 09 10:29:36 crc kubenswrapper[4740]: E1009 10:29:36.757321 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-klnl8_openshift-ovn-kubernetes(192f5d73-ad53-4674-8c35-c72343c6022e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" podUID="192f5d73-ad53-4674-8c35-c72343c6022e"
Oct 09 10:29:37 crc kubenswrapper[4740]: I1009 10:29:37.343918 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qrhgt_73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c/kube-multus/1.log"
Oct 09 10:29:37 crc kubenswrapper[4740]: I1009 10:29:37.345743 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qrhgt_73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c/kube-multus/0.log"
Oct 09 10:29:37 crc kubenswrapper[4740]: I1009 10:29:37.345840 4740 generic.go:334] "Generic (PLEG): container finished" podID="73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c" containerID="5ed60b7e9b987350e5bfa5f576c1b11d0e02fa7c1adba23203dbfb327ce4f518" exitCode=1
Oct 09 10:29:37 crc kubenswrapper[4740]: I1009 10:29:37.345914 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qrhgt" event={"ID":"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c","Type":"ContainerDied","Data":"5ed60b7e9b987350e5bfa5f576c1b11d0e02fa7c1adba23203dbfb327ce4f518"}
Oct 09 10:29:37 crc kubenswrapper[4740]: I1009 10:29:37.346029 4740 scope.go:117] "RemoveContainer" containerID="2aaf51e73ad13447796cef3dc44477a11729ee8ef25330ffe94c49c116cf1be5"
Oct 09 10:29:37 crc kubenswrapper[4740]: I1009 10:29:37.346982 4740 scope.go:117] "RemoveContainer" containerID="5ed60b7e9b987350e5bfa5f576c1b11d0e02fa7c1adba23203dbfb327ce4f518"
Oct 09 10:29:37 crc kubenswrapper[4740]: E1009 10:29:37.347419 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-qrhgt_openshift-multus(73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c)\"" pod="openshift-multus/multus-qrhgt" podUID="73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c"
Oct 09 10:29:37 crc kubenswrapper[4740]: I1009 10:29:37.752840 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 09 10:29:37 crc kubenswrapper[4740]: I1009 10:29:37.752843 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 09 10:29:37 crc kubenswrapper[4740]: E1009 10:29:37.753207 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 09 10:29:37 crc kubenswrapper[4740]: I1009 10:29:37.752873 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9"
Oct 09 10:29:37 crc kubenswrapper[4740]: E1009 10:29:37.753287 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 09 10:29:37 crc kubenswrapper[4740]: E1009 10:29:37.753365 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f"
Oct 09 10:29:38 crc kubenswrapper[4740]: I1009 10:29:38.352035 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qrhgt_73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c/kube-multus/1.log"
Oct 09 10:29:38 crc kubenswrapper[4740]: I1009 10:29:38.752515 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 09 10:29:38 crc kubenswrapper[4740]: E1009 10:29:38.752676 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 09 10:29:39 crc kubenswrapper[4740]: I1009 10:29:39.752657 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 09 10:29:39 crc kubenswrapper[4740]: I1009 10:29:39.753116 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 09 10:29:39 crc kubenswrapper[4740]: E1009 10:29:39.754200 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 09 10:29:39 crc kubenswrapper[4740]: I1009 10:29:39.753176 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9"
Oct 09 10:29:39 crc kubenswrapper[4740]: E1009 10:29:39.754437 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f"
Oct 09 10:29:39 crc kubenswrapper[4740]: E1009 10:29:39.754450 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 09 10:29:40 crc kubenswrapper[4740]: I1009 10:29:40.753442 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 09 10:29:40 crc kubenswrapper[4740]: E1009 10:29:40.754202 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 09 10:29:41 crc kubenswrapper[4740]: I1009 10:29:41.752793 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 09 10:29:41 crc kubenswrapper[4740]: I1009 10:29:41.752861 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 09 10:29:41 crc kubenswrapper[4740]: E1009 10:29:41.754006 4740 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Oct 09 10:29:41 crc kubenswrapper[4740]: E1009 10:29:41.755239 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 09 10:29:41 crc kubenswrapper[4740]: I1009 10:29:41.756071 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9"
Oct 09 10:29:41 crc kubenswrapper[4740]: E1009 10:29:41.756115 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 09 10:29:41 crc kubenswrapper[4740]: E1009 10:29:41.756354 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f"
Oct 09 10:29:41 crc kubenswrapper[4740]: E1009 10:29:41.885263 4740 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 09 10:29:42 crc kubenswrapper[4740]: I1009 10:29:42.752980 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 09 10:29:42 crc kubenswrapper[4740]: E1009 10:29:42.753123 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 09 10:29:43 crc kubenswrapper[4740]: I1009 10:29:43.753324 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 09 10:29:43 crc kubenswrapper[4740]: I1009 10:29:43.753395 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 09 10:29:43 crc kubenswrapper[4740]: E1009 10:29:43.753504 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 09 10:29:43 crc kubenswrapper[4740]: E1009 10:29:43.753736 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 09 10:29:43 crc kubenswrapper[4740]: I1009 10:29:43.753971 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9"
Oct 09 10:29:43 crc kubenswrapper[4740]: E1009 10:29:43.754092 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f"
Oct 09 10:29:44 crc kubenswrapper[4740]: I1009 10:29:44.752590 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 09 10:29:44 crc kubenswrapper[4740]: E1009 10:29:44.752788 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 09 10:29:45 crc kubenswrapper[4740]: I1009 10:29:45.753151 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 09 10:29:45 crc kubenswrapper[4740]: I1009 10:29:45.753197 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9"
Oct 09 10:29:45 crc kubenswrapper[4740]: I1009 10:29:45.753218 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 09 10:29:45 crc kubenswrapper[4740]: E1009 10:29:45.753348 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 09 10:29:45 crc kubenswrapper[4740]: E1009 10:29:45.753535 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f"
Oct 09 10:29:45 crc kubenswrapper[4740]: E1009 10:29:45.753677 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 09 10:29:46 crc kubenswrapper[4740]: I1009 10:29:46.753124 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 09 10:29:46 crc kubenswrapper[4740]: E1009 10:29:46.753313 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 09 10:29:46 crc kubenswrapper[4740]: E1009 10:29:46.888251 4740 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 09 10:29:47 crc kubenswrapper[4740]: I1009 10:29:47.753478 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9"
Oct 09 10:29:47 crc kubenswrapper[4740]: I1009 10:29:47.753516 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 09 10:29:47 crc kubenswrapper[4740]: I1009 10:29:47.753576 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 09 10:29:47 crc kubenswrapper[4740]: E1009 10:29:47.753698 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f"
Oct 09 10:29:47 crc kubenswrapper[4740]: E1009 10:29:47.753829 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 09 10:29:47 crc kubenswrapper[4740]: E1009 10:29:47.753919 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 09 10:29:48 crc kubenswrapper[4740]: I1009 10:29:48.753182 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 09 10:29:48 crc kubenswrapper[4740]: E1009 10:29:48.753601 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 09 10:29:48 crc kubenswrapper[4740]: I1009 10:29:48.753892 4740 scope.go:117] "RemoveContainer" containerID="5ed60b7e9b987350e5bfa5f576c1b11d0e02fa7c1adba23203dbfb327ce4f518"
Oct 09 10:29:49 crc kubenswrapper[4740]: I1009 10:29:49.396905 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qrhgt_73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c/kube-multus/1.log"
Oct 09 10:29:49 crc kubenswrapper[4740]: I1009 10:29:49.397422 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qrhgt" event={"ID":"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c","Type":"ContainerStarted","Data":"291dfda6e2a2a98625a59d8fb1e8a1e9ca87c6d5b3650d8087ca2d28c0ae233c"}
Oct 09 10:29:49 crc kubenswrapper[4740]: I1009 10:29:49.753111 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 09 10:29:49 crc kubenswrapper[4740]: I1009 10:29:49.753177 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 09 10:29:49 crc kubenswrapper[4740]: E1009 10:29:49.753230 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:29:49 crc kubenswrapper[4740]: E1009 10:29:49.753323 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:29:49 crc kubenswrapper[4740]: I1009 10:29:49.753337 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:29:49 crc kubenswrapper[4740]: E1009 10:29:49.753407 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:29:50 crc kubenswrapper[4740]: I1009 10:29:50.753389 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:29:50 crc kubenswrapper[4740]: E1009 10:29:50.753844 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:29:50 crc kubenswrapper[4740]: I1009 10:29:50.754178 4740 scope.go:117] "RemoveContainer" containerID="f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767" Oct 09 10:29:51 crc kubenswrapper[4740]: I1009 10:29:51.403316 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-klnl8_192f5d73-ad53-4674-8c35-c72343c6022e/ovnkube-controller/3.log" Oct 09 10:29:51 crc kubenswrapper[4740]: I1009 10:29:51.405821 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerStarted","Data":"e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32"} Oct 09 10:29:51 crc kubenswrapper[4740]: I1009 10:29:51.406335 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:29:51 crc kubenswrapper[4740]: I1009 10:29:51.430975 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" podStartSLOduration=109.430959561 podStartE2EDuration="1m49.430959561s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:29:51.430290373 +0000 UTC m=+130.392490754" watchObservedRunningTime="2025-10-09 10:29:51.430959561 +0000 UTC m=+130.393159942" Oct 09 10:29:51 crc kubenswrapper[4740]: I1009 10:29:51.579621 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z74b9"] Oct 09 10:29:51 crc kubenswrapper[4740]: I1009 10:29:51.579748 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:29:51 crc kubenswrapper[4740]: E1009 10:29:51.579870 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:29:51 crc kubenswrapper[4740]: I1009 10:29:51.753479 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:29:51 crc kubenswrapper[4740]: I1009 10:29:51.753516 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:29:51 crc kubenswrapper[4740]: E1009 10:29:51.754731 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:29:51 crc kubenswrapper[4740]: E1009 10:29:51.754844 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:29:51 crc kubenswrapper[4740]: E1009 10:29:51.888700 4740 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 09 10:29:52 crc kubenswrapper[4740]: I1009 10:29:52.753594 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:29:52 crc kubenswrapper[4740]: E1009 10:29:52.753973 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:29:53 crc kubenswrapper[4740]: I1009 10:29:53.753304 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:29:53 crc kubenswrapper[4740]: I1009 10:29:53.753335 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:29:53 crc kubenswrapper[4740]: I1009 10:29:53.753342 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:29:53 crc kubenswrapper[4740]: E1009 10:29:53.753418 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:29:53 crc kubenswrapper[4740]: E1009 10:29:53.753635 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:29:53 crc kubenswrapper[4740]: E1009 10:29:53.753742 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:29:54 crc kubenswrapper[4740]: I1009 10:29:54.753711 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:29:54 crc kubenswrapper[4740]: E1009 10:29:54.753921 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:29:55 crc kubenswrapper[4740]: I1009 10:29:55.752746 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:29:55 crc kubenswrapper[4740]: I1009 10:29:55.752746 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:29:55 crc kubenswrapper[4740]: E1009 10:29:55.752929 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z74b9" podUID="01aecf36-9a78-414c-8078-5c114c1dfa3f" Oct 09 10:29:55 crc kubenswrapper[4740]: E1009 10:29:55.752958 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 10:29:55 crc kubenswrapper[4740]: I1009 10:29:55.753230 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:29:55 crc kubenswrapper[4740]: E1009 10:29:55.753322 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 10:29:56 crc kubenswrapper[4740]: I1009 10:29:56.753178 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:29:56 crc kubenswrapper[4740]: E1009 10:29:56.753310 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 10:29:57 crc kubenswrapper[4740]: I1009 10:29:57.753868 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:29:57 crc kubenswrapper[4740]: I1009 10:29:57.753915 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:29:57 crc kubenswrapper[4740]: I1009 10:29:57.753988 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:29:57 crc kubenswrapper[4740]: I1009 10:29:57.755979 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 09 10:29:57 crc kubenswrapper[4740]: I1009 10:29:57.756706 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 09 10:29:57 crc kubenswrapper[4740]: I1009 10:29:57.756929 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 09 10:29:57 crc kubenswrapper[4740]: I1009 10:29:57.757342 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 09 10:29:57 crc kubenswrapper[4740]: I1009 10:29:57.757424 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 09 10:29:57 crc kubenswrapper[4740]: I1009 10:29:57.757579 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 09 10:29:58 crc kubenswrapper[4740]: I1009 10:29:58.753289 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.060835 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.130467 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x99pn"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.130953 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.133713 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lxzfg"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.133776 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.134003 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.133868 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.133963 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.134952 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.136095 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.136930 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.143343 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pxg57"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.144461 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-pxg57" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.146006 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.146341 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.156003 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-wscmf"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.156839 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wscmf" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.164178 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.164811 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.176164 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.176576 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.181537 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6zqw2"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.185664 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.186410 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.186937 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.187154 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.187223 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.187407 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.187641 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.187696 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.188032 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.188047 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.188213 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.192404 4740 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.193326 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.193606 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.193786 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.193812 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.193894 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.193976 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.194092 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.194111 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.193608 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.194427 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.210350 4740 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.210779 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.210879 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.211070 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.211138 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.212522 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2tnv4"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.213061 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.213102 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2tnv4" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.213233 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.213377 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.213474 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.213501 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-f67b5"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.213556 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.213848 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.213921 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.213861 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.213996 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-f67b5" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.214065 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.219064 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.219650 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.222319 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zlt7s"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.223460 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.224083 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.224303 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.224709 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.224908 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.226945 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvppr"] Oct 09 10:30:00 crc 
kubenswrapper[4740]: I1009 10:30:00.227134 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zlt7s"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.227483 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-g68sq"]
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.227833 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5pc6m"]
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.227948 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-g68sq"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.227995 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvppr"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.228424 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-qzp8b"]
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.229164 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.230835 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.231070 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.231238 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.235923 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.236705 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.236825 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.237645 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.238935 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.239117 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qzp8b"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.243862 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pjdt4"]
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.244249 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fs6nz"]
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.244514 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-h4zqk"]
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.244961 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h4zqk"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.245366 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pjdt4"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.245642 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fs6nz"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.246866 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.247195 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.247297 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6wv7f"]
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.247349 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.247538 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.247578 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbxhv"]
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.247683 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.248073 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.248093 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbxhv"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.248317 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6wv7f"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.248321 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.248576 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.248699 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.248706 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.248634 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.252342 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.252519 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.264455 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.264788 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.264916 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.264518 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.265269 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.267476 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.268286 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.268290 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.268439 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.268722 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.268733 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.268869 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.268960 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.268979 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.269106 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.269237 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.269364 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.269368 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.269460 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.269568 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.269735 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.271532 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.271647 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.271763 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.272358 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.278144 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.345250 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.345501 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lrpmd"]
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.346172 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-lrpmd"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.346558 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pxg57"]
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.346714 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.347401 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ssz6d"]
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.347965 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkcp8\" (UniqueName: \"kubernetes.io/projected/a23bc8cd-dc20-4ade-88cd-1c61d1f6315f-kube-api-access-bkcp8\") pod \"console-operator-58897d9998-f67b5\" (UID: \"a23bc8cd-dc20-4ade-88cd-1c61d1f6315f\") " pod="openshift-console-operator/console-operator-58897d9998-f67b5"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.348092 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.348215 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/833471e4-0651-45ca-aec1-35c2a8a56b5f-audit-policies\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.348311 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlm4d\" (UniqueName: \"kubernetes.io/projected/68bb29c6-1224-44e0-b307-4a2b226288c5-kube-api-access-zlm4d\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.348178 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.348435 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9063b645-eba3-4ba3-a871-23adad70136d-config\") pod \"route-controller-manager-6576b87f9c-5bn7g\" (UID: \"9063b645-eba3-4ba3-a871-23adad70136d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.348774 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.348926 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/68bb29c6-1224-44e0-b307-4a2b226288c5-etcd-client\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.349020 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nb9c\" (UniqueName: \"kubernetes.io/projected/833471e4-0651-45ca-aec1-35c2a8a56b5f-kube-api-access-5nb9c\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.349091 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqxth\" (UniqueName: \"kubernetes.io/projected/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-kube-api-access-sqxth\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.349244 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/833471e4-0651-45ca-aec1-35c2a8a56b5f-serving-cert\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.349536 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/db1aed22-417f-47ad-a29d-78effc6ac28d-machine-approver-tls\") pod \"machine-approver-56656f9798-wscmf\" (UID: \"db1aed22-417f-47ad-a29d-78effc6ac28d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wscmf"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.349633 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/68bb29c6-1224-44e0-b307-4a2b226288c5-image-import-ca\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.349745 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-598nr"]
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.349776 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-audit-policies\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.349951 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.350033 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/833471e4-0651-45ca-aec1-35c2a8a56b5f-etcd-client\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.350108 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/68bb29c6-1224-44e0-b307-4a2b226288c5-encryption-config\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.350191 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9063b645-eba3-4ba3-a871-23adad70136d-serving-cert\") pod \"route-controller-manager-6576b87f9c-5bn7g\" (UID: \"9063b645-eba3-4ba3-a871-23adad70136d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.350252 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-598nr"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.350323 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.350407 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bfff965-33fb-4412-85e1-107e0cf34bf8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2tnv4\" (UID: \"6bfff965-33fb-4412-85e1-107e0cf34bf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2tnv4"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.350489 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bfff965-33fb-4412-85e1-107e0cf34bf8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2tnv4\" (UID: \"6bfff965-33fb-4412-85e1-107e0cf34bf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2tnv4"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.350560 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a23bc8cd-dc20-4ade-88cd-1c61d1f6315f-trusted-ca\") pod \"console-operator-58897d9998-f67b5\" (UID: \"a23bc8cd-dc20-4ade-88cd-1c61d1f6315f\") " pod="openshift-console-operator/console-operator-58897d9998-f67b5"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.350679 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68bb29c6-1224-44e0-b307-4a2b226288c5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.350773 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hz6d\" (UniqueName: \"kubernetes.io/projected/9063b645-eba3-4ba3-a871-23adad70136d-kube-api-access-2hz6d\") pod \"route-controller-manager-6576b87f9c-5bn7g\" (UID: \"9063b645-eba3-4ba3-a871-23adad70136d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.350854 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db1aed22-417f-47ad-a29d-78effc6ac28d-config\") pod \"machine-approver-56656f9798-wscmf\" (UID: \"db1aed22-417f-47ad-a29d-78effc6ac28d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wscmf"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.350941 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/63fc8742-b2a1-42a1-b78e-11e736801124-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x99pn\" (UID: \"63fc8742-b2a1-42a1-b78e-11e736801124\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.351033 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.351111 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.351454 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.351582 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cfbd3fb-f7f5-4578-9e24-72dbd185cf12-config\") pod \"machine-api-operator-5694c8668f-pxg57\" (UID: \"2cfbd3fb-f7f5-4578-9e24-72dbd185cf12\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxg57"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.351714 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58xh4\" (UniqueName: \"kubernetes.io/projected/2cfbd3fb-f7f5-4578-9e24-72dbd185cf12-kube-api-access-58xh4\") pod \"machine-api-operator-5694c8668f-pxg57\" (UID: \"2cfbd3fb-f7f5-4578-9e24-72dbd185cf12\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxg57"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.351861 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/833471e4-0651-45ca-aec1-35c2a8a56b5f-encryption-config\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.351944 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63fc8742-b2a1-42a1-b78e-11e736801124-client-ca\") pod \"controller-manager-879f6c89f-x99pn\" (UID: \"63fc8742-b2a1-42a1-b78e-11e736801124\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.352041 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9063b645-eba3-4ba3-a871-23adad70136d-client-ca\") pod \"route-controller-manager-6576b87f9c-5bn7g\" (UID: \"9063b645-eba3-4ba3-a871-23adad70136d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.352127 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/833471e4-0651-45ca-aec1-35c2a8a56b5f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.352203 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s478v\" (UniqueName: \"kubernetes.io/projected/6bfff965-33fb-4412-85e1-107e0cf34bf8-kube-api-access-s478v\") pod \"openshift-apiserver-operator-796bbdcf4f-2tnv4\" (UID: \"6bfff965-33fb-4412-85e1-107e0cf34bf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2tnv4"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.352153 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.352274 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.352447 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.352535 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fc8742-b2a1-42a1-b78e-11e736801124-serving-cert\") pod \"controller-manager-879f6c89f-x99pn\" (UID: \"63fc8742-b2a1-42a1-b78e-11e736801124\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.352603 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a23bc8cd-dc20-4ade-88cd-1c61d1f6315f-serving-cert\") pod \"console-operator-58897d9998-f67b5\" (UID: \"a23bc8cd-dc20-4ade-88cd-1c61d1f6315f\") " pod="openshift-console-operator/console-operator-58897d9998-f67b5"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.352714 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68bb29c6-1224-44e0-b307-4a2b226288c5-serving-cert\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.352884 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2cfbd3fb-f7f5-4578-9e24-72dbd185cf12-images\") pod \"machine-api-operator-5694c8668f-pxg57\" (UID: \"2cfbd3fb-f7f5-4578-9e24-72dbd185cf12\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxg57"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.353049 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-g6tmc"]
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.353127 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/db1aed22-417f-47ad-a29d-78effc6ac28d-auth-proxy-config\") pod \"machine-approver-56656f9798-wscmf\" (UID: \"db1aed22-417f-47ad-a29d-78effc6ac28d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wscmf"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.353209 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68bb29c6-1224-44e0-b307-4a2b226288c5-config\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.353300 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.353385 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/68bb29c6-1224-44e0-b307-4a2b226288c5-etcd-serving-ca\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.353494 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc8742-b2a1-42a1-b78e-11e736801124-config\") pod \"controller-manager-879f6c89f-x99pn\" (UID: \"63fc8742-b2a1-42a1-b78e-11e736801124\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.353568 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a23bc8cd-dc20-4ade-88cd-1c61d1f6315f-config\") pod \"console-operator-58897d9998-f67b5\" (UID: \"a23bc8cd-dc20-4ade-88cd-1c61d1f6315f\") " pod="openshift-console-operator/console-operator-58897d9998-f67b5"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.353631 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.353701 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/833471e4-0651-45ca-aec1-35c2a8a56b5f-audit-dir\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.353642 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-g6tmc"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.353790 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvg5x\" (UniqueName: \"kubernetes.io/projected/db1aed22-417f-47ad-a29d-78effc6ac28d-kube-api-access-kvg5x\") pod \"machine-approver-56656f9798-wscmf\" (UID: \"db1aed22-417f-47ad-a29d-78effc6ac28d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wscmf"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.353927 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/833471e4-0651-45ca-aec1-35c2a8a56b5f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.354007 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/68bb29c6-1224-44e0-b307-4a2b226288c5-audit\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.354081 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4hsd\" (UniqueName: \"kubernetes.io/projected/63fc8742-b2a1-42a1-b78e-11e736801124-kube-api-access-b4hsd\") pod \"controller-manager-879f6c89f-x99pn\" (UID: \"63fc8742-b2a1-42a1-b78e-11e736801124\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.354154 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/68bb29c6-1224-44e0-b307-4a2b226288c5-node-pullsecrets\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.354296 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2cfbd3fb-f7f5-4578-9e24-72dbd185cf12-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pxg57\" (UID: \"2cfbd3fb-f7f5-4578-9e24-72dbd185cf12\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxg57"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.354377 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/68bb29c6-1224-44e0-b307-4a2b226288c5-audit-dir\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.354512 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-audit-dir\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2"
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.356109 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-828hm"]
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.356944 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-828hm" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.357131 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.357256 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.357721 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.358169 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.358378 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.358630 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.358646 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.358907 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.360528 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.362475 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2tnv4"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.363625 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-f67b5"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.365819 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-t6fqj"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.365930 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.366577 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6fqj" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.367638 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flsgk"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.368539 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flsgk" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.368660 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kp59b"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.369571 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-kp59b" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.370881 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zxkqd"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.371699 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.371904 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-vntt9"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.372508 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-vntt9" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.372961 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.373870 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qwphn"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.374648 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mr7wc"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.375180 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qwphn" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.375278 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mr7wc" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.377681 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-959ns"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.379319 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-g68sq"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.379419 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x99pn"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.379414 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-959ns" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.383318 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cm7ws"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.385521 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.385685 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cm7ws" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.389144 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lxzfg"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.389639 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.391157 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-jrhp4"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.392551 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-jrhp4" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.392994 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.394691 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-842d9"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.395535 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-842d9" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.395566 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w956x"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.399012 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hstzh"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.399502 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.399604 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hstzh" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.399922 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w956x" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.400076 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pjdt4"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.401566 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zlt7s"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.402396 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-h4zqk"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.405190 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lrpmd"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.405225 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbxhv"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.408041 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flsgk"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.408072 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qwphn"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.408129 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvppr"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.410070 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-959ns"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.411164 4740 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fs6nz"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.412510 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.412852 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6zqw2"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.413858 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-828hm"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.415918 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ssz6d"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.417054 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cm7ws"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.417552 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qzp8b"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.418951 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5pc6m"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.420222 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-g6tmc"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.422028 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w956x"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.423008 4740 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.423458 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.423517 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-95p7x"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.424638 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-598nr"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.424794 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-95p7x" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.427075 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zxkqd"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.428469 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.430142 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mr7wc"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.431552 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-842d9"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.433051 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.433200 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-t6fqj"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.434993 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kp59b"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.436598 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6wv7f"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.439233 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.440421 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-95p7x"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.441574 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hstzh"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.443616 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xhb6x"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.444550 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xhb6x" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.445010 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-c78zk"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.446384 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-c78zk" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.446667 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xhb6x"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.447844 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-c78zk"] Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.453059 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455169 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58xh4\" (UniqueName: \"kubernetes.io/projected/2cfbd3fb-f7f5-4578-9e24-72dbd185cf12-kube-api-access-58xh4\") pod \"machine-api-operator-5694c8668f-pxg57\" (UID: \"2cfbd3fb-f7f5-4578-9e24-72dbd185cf12\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxg57" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455200 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/833471e4-0651-45ca-aec1-35c2a8a56b5f-encryption-config\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455217 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63fc8742-b2a1-42a1-b78e-11e736801124-client-ca\") pod \"controller-manager-879f6c89f-x99pn\" (UID: \"63fc8742-b2a1-42a1-b78e-11e736801124\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455235 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/9063b645-eba3-4ba3-a871-23adad70136d-client-ca\") pod \"route-controller-manager-6576b87f9c-5bn7g\" (UID: \"9063b645-eba3-4ba3-a871-23adad70136d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455253 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/833471e4-0651-45ca-aec1-35c2a8a56b5f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455270 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s478v\" (UniqueName: \"kubernetes.io/projected/6bfff965-33fb-4412-85e1-107e0cf34bf8-kube-api-access-s478v\") pod \"openshift-apiserver-operator-796bbdcf4f-2tnv4\" (UID: \"6bfff965-33fb-4412-85e1-107e0cf34bf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2tnv4" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455286 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455303 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455322 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd8ebc5c-e47c-4177-968f-a3c924dbda0e-config\") pod \"etcd-operator-b45778765-ssz6d\" (UID: \"bd8ebc5c-e47c-4177-968f-a3c924dbda0e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455342 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fc8742-b2a1-42a1-b78e-11e736801124-serving-cert\") pod \"controller-manager-879f6c89f-x99pn\" (UID: \"63fc8742-b2a1-42a1-b78e-11e736801124\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455360 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a23bc8cd-dc20-4ade-88cd-1c61d1f6315f-serving-cert\") pod \"console-operator-58897d9998-f67b5\" (UID: \"a23bc8cd-dc20-4ade-88cd-1c61d1f6315f\") " pod="openshift-console-operator/console-operator-58897d9998-f67b5" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455377 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68bb29c6-1224-44e0-b307-4a2b226288c5-serving-cert\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455393 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bd8ebc5c-e47c-4177-968f-a3c924dbda0e-etcd-service-ca\") pod \"etcd-operator-b45778765-ssz6d\" 
(UID: \"bd8ebc5c-e47c-4177-968f-a3c924dbda0e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455419 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/db1aed22-417f-47ad-a29d-78effc6ac28d-auth-proxy-config\") pod \"machine-approver-56656f9798-wscmf\" (UID: \"db1aed22-417f-47ad-a29d-78effc6ac28d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wscmf" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455445 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68bb29c6-1224-44e0-b307-4a2b226288c5-config\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455461 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455479 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47rtd\" (UniqueName: \"kubernetes.io/projected/bd8ebc5c-e47c-4177-968f-a3c924dbda0e-kube-api-access-47rtd\") pod \"etcd-operator-b45778765-ssz6d\" (UID: \"bd8ebc5c-e47c-4177-968f-a3c924dbda0e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455494 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/2cfbd3fb-f7f5-4578-9e24-72dbd185cf12-images\") pod \"machine-api-operator-5694c8668f-pxg57\" (UID: \"2cfbd3fb-f7f5-4578-9e24-72dbd185cf12\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxg57" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455510 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/68bb29c6-1224-44e0-b307-4a2b226288c5-etcd-serving-ca\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455529 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a23bc8cd-dc20-4ade-88cd-1c61d1f6315f-config\") pod \"console-operator-58897d9998-f67b5\" (UID: \"a23bc8cd-dc20-4ade-88cd-1c61d1f6315f\") " pod="openshift-console-operator/console-operator-58897d9998-f67b5" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455554 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455591 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc8742-b2a1-42a1-b78e-11e736801124-config\") pod \"controller-manager-879f6c89f-x99pn\" (UID: \"63fc8742-b2a1-42a1-b78e-11e736801124\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455613 4740 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/833471e4-0651-45ca-aec1-35c2a8a56b5f-audit-dir\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455709 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvg5x\" (UniqueName: \"kubernetes.io/projected/db1aed22-417f-47ad-a29d-78effc6ac28d-kube-api-access-kvg5x\") pod \"machine-approver-56656f9798-wscmf\" (UID: \"db1aed22-417f-47ad-a29d-78effc6ac28d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wscmf" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455734 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bd8ebc5c-e47c-4177-968f-a3c924dbda0e-etcd-client\") pod \"etcd-operator-b45778765-ssz6d\" (UID: \"bd8ebc5c-e47c-4177-968f-a3c924dbda0e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455777 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/833471e4-0651-45ca-aec1-35c2a8a56b5f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455796 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/68bb29c6-1224-44e0-b307-4a2b226288c5-audit\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455815 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/68bb29c6-1224-44e0-b307-4a2b226288c5-node-pullsecrets\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455836 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4hsd\" (UniqueName: \"kubernetes.io/projected/63fc8742-b2a1-42a1-b78e-11e736801124-kube-api-access-b4hsd\") pod \"controller-manager-879f6c89f-x99pn\" (UID: \"63fc8742-b2a1-42a1-b78e-11e736801124\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455856 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2cfbd3fb-f7f5-4578-9e24-72dbd185cf12-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pxg57\" (UID: \"2cfbd3fb-f7f5-4578-9e24-72dbd185cf12\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxg57" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455877 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/68bb29c6-1224-44e0-b307-4a2b226288c5-audit-dir\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455898 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-audit-dir\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc 
kubenswrapper[4740]: I1009 10:30:00.455925 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkcp8\" (UniqueName: \"kubernetes.io/projected/a23bc8cd-dc20-4ade-88cd-1c61d1f6315f-kube-api-access-bkcp8\") pod \"console-operator-58897d9998-f67b5\" (UID: \"a23bc8cd-dc20-4ade-88cd-1c61d1f6315f\") " pod="openshift-console-operator/console-operator-58897d9998-f67b5" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455948 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455972 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7r5q\" (UniqueName: \"kubernetes.io/projected/974f7f71-f43b-4a14-bac3-567229c728c7-kube-api-access-j7r5q\") pod \"openshift-controller-manager-operator-756b6f6bc6-fs6nz\" (UID: \"974f7f71-f43b-4a14-bac3-567229c728c7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fs6nz" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.455999 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/833471e4-0651-45ca-aec1-35c2a8a56b5f-audit-policies\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456023 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlm4d\" (UniqueName: 
\"kubernetes.io/projected/68bb29c6-1224-44e0-b307-4a2b226288c5-kube-api-access-zlm4d\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456059 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9063b645-eba3-4ba3-a871-23adad70136d-config\") pod \"route-controller-manager-6576b87f9c-5bn7g\" (UID: \"9063b645-eba3-4ba3-a871-23adad70136d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456095 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456120 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/68bb29c6-1224-44e0-b307-4a2b226288c5-etcd-client\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456150 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqxth\" (UniqueName: \"kubernetes.io/projected/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-kube-api-access-sqxth\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456235 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/833471e4-0651-45ca-aec1-35c2a8a56b5f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456257 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nb9c\" (UniqueName: \"kubernetes.io/projected/833471e4-0651-45ca-aec1-35c2a8a56b5f-kube-api-access-5nb9c\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456312 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/833471e4-0651-45ca-aec1-35c2a8a56b5f-serving-cert\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456347 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/db1aed22-417f-47ad-a29d-78effc6ac28d-machine-approver-tls\") pod \"machine-approver-56656f9798-wscmf\" (UID: \"db1aed22-417f-47ad-a29d-78effc6ac28d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wscmf" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456380 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/68bb29c6-1224-44e0-b307-4a2b226288c5-image-import-ca\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc 
kubenswrapper[4740]: I1009 10:30:00.456383 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9063b645-eba3-4ba3-a871-23adad70136d-client-ca\") pod \"route-controller-manager-6576b87f9c-5bn7g\" (UID: \"9063b645-eba3-4ba3-a871-23adad70136d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456408 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-audit-policies\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456437 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456442 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456503 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/974f7f71-f43b-4a14-bac3-567229c728c7-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-fs6nz\" (UID: \"974f7f71-f43b-4a14-bac3-567229c728c7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fs6nz" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456540 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/833471e4-0651-45ca-aec1-35c2a8a56b5f-etcd-client\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456579 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9063b645-eba3-4ba3-a871-23adad70136d-serving-cert\") pod \"route-controller-manager-6576b87f9c-5bn7g\" (UID: \"9063b645-eba3-4ba3-a871-23adad70136d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456607 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456638 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/68bb29c6-1224-44e0-b307-4a2b226288c5-encryption-config\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456666 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bfff965-33fb-4412-85e1-107e0cf34bf8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2tnv4\" (UID: \"6bfff965-33fb-4412-85e1-107e0cf34bf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2tnv4" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456701 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/974f7f71-f43b-4a14-bac3-567229c728c7-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-fs6nz\" (UID: \"974f7f71-f43b-4a14-bac3-567229c728c7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fs6nz" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456734 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bfff965-33fb-4412-85e1-107e0cf34bf8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2tnv4\" (UID: \"6bfff965-33fb-4412-85e1-107e0cf34bf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2tnv4" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456768 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/68bb29c6-1224-44e0-b307-4a2b226288c5-audit-dir\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456785 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a23bc8cd-dc20-4ade-88cd-1c61d1f6315f-trusted-ca\") pod \"console-operator-58897d9998-f67b5\" (UID: \"a23bc8cd-dc20-4ade-88cd-1c61d1f6315f\") " 
pod="openshift-console-operator/console-operator-58897d9998-f67b5" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456819 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68bb29c6-1224-44e0-b307-4a2b226288c5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456849 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hz6d\" (UniqueName: \"kubernetes.io/projected/9063b645-eba3-4ba3-a871-23adad70136d-kube-api-access-2hz6d\") pod \"route-controller-manager-6576b87f9c-5bn7g\" (UID: \"9063b645-eba3-4ba3-a871-23adad70136d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456896 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db1aed22-417f-47ad-a29d-78effc6ac28d-config\") pod \"machine-approver-56656f9798-wscmf\" (UID: \"db1aed22-417f-47ad-a29d-78effc6ac28d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wscmf" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456923 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456933 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-audit-dir\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456947 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/63fc8742-b2a1-42a1-b78e-11e736801124-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x99pn\" (UID: \"63fc8742-b2a1-42a1-b78e-11e736801124\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456976 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.457045 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bd8ebc5c-e47c-4177-968f-a3c924dbda0e-etcd-ca\") pod \"etcd-operator-b45778765-ssz6d\" (UID: \"bd8ebc5c-e47c-4177-968f-a3c924dbda0e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.457087 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 
10:30:00.457131 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cfbd3fb-f7f5-4578-9e24-72dbd185cf12-config\") pod \"machine-api-operator-5694c8668f-pxg57\" (UID: \"2cfbd3fb-f7f5-4578-9e24-72dbd185cf12\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxg57" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.457165 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd8ebc5c-e47c-4177-968f-a3c924dbda0e-serving-cert\") pod \"etcd-operator-b45778765-ssz6d\" (UID: \"bd8ebc5c-e47c-4177-968f-a3c924dbda0e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.457830 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2cfbd3fb-f7f5-4578-9e24-72dbd185cf12-images\") pod \"machine-api-operator-5694c8668f-pxg57\" (UID: \"2cfbd3fb-f7f5-4578-9e24-72dbd185cf12\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxg57" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.458533 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.458602 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63fc8742-b2a1-42a1-b78e-11e736801124-client-ca\") pod \"controller-manager-879f6c89f-x99pn\" (UID: \"63fc8742-b2a1-42a1-b78e-11e736801124\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.460137 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a23bc8cd-dc20-4ade-88cd-1c61d1f6315f-config\") pod \"console-operator-58897d9998-f67b5\" (UID: \"a23bc8cd-dc20-4ade-88cd-1c61d1f6315f\") " pod="openshift-console-operator/console-operator-58897d9998-f67b5" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.460626 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/833471e4-0651-45ca-aec1-35c2a8a56b5f-audit-dir\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.461386 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/68bb29c6-1224-44e0-b307-4a2b226288c5-audit\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.461467 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/db1aed22-417f-47ad-a29d-78effc6ac28d-auth-proxy-config\") pod \"machine-approver-56656f9798-wscmf\" (UID: \"db1aed22-417f-47ad-a29d-78effc6ac28d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wscmf" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.461519 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/68bb29c6-1224-44e0-b307-4a2b226288c5-node-pullsecrets\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " 
pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.461577 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68bb29c6-1224-44e0-b307-4a2b226288c5-config\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.462656 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/833471e4-0651-45ca-aec1-35c2a8a56b5f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.462917 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9063b645-eba3-4ba3-a871-23adad70136d-config\") pod \"route-controller-manager-6576b87f9c-5bn7g\" (UID: \"9063b645-eba3-4ba3-a871-23adad70136d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.463685 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68bb29c6-1224-44e0-b307-4a2b226288c5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.464025 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/833471e4-0651-45ca-aec1-35c2a8a56b5f-audit-policies\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.464992 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/68bb29c6-1224-44e0-b307-4a2b226288c5-image-import-ca\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.465794 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc8742-b2a1-42a1-b78e-11e736801124-config\") pod \"controller-manager-879f6c89f-x99pn\" (UID: \"63fc8742-b2a1-42a1-b78e-11e736801124\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.468173 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-audit-policies\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.456634 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/68bb29c6-1224-44e0-b307-4a2b226288c5-etcd-serving-ca\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.470411 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/63fc8742-b2a1-42a1-b78e-11e736801124-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x99pn\" (UID: \"63fc8742-b2a1-42a1-b78e-11e736801124\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.471233 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db1aed22-417f-47ad-a29d-78effc6ac28d-config\") pod \"machine-approver-56656f9798-wscmf\" (UID: \"db1aed22-417f-47ad-a29d-78effc6ac28d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wscmf" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.471711 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bfff965-33fb-4412-85e1-107e0cf34bf8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2tnv4\" (UID: \"6bfff965-33fb-4412-85e1-107e0cf34bf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2tnv4" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.471839 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cfbd3fb-f7f5-4578-9e24-72dbd185cf12-config\") pod \"machine-api-operator-5694c8668f-pxg57\" (UID: \"2cfbd3fb-f7f5-4578-9e24-72dbd185cf12\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxg57" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.472201 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.472361 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/833471e4-0651-45ca-aec1-35c2a8a56b5f-serving-cert\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: 
\"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.473557 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a23bc8cd-dc20-4ade-88cd-1c61d1f6315f-serving-cert\") pod \"console-operator-58897d9998-f67b5\" (UID: \"a23bc8cd-dc20-4ade-88cd-1c61d1f6315f\") " pod="openshift-console-operator/console-operator-58897d9998-f67b5" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.474288 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.474562 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a23bc8cd-dc20-4ade-88cd-1c61d1f6315f-trusted-ca\") pod \"console-operator-58897d9998-f67b5\" (UID: \"a23bc8cd-dc20-4ade-88cd-1c61d1f6315f\") " pod="openshift-console-operator/console-operator-58897d9998-f67b5" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.475539 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2cfbd3fb-f7f5-4578-9e24-72dbd185cf12-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pxg57\" (UID: \"2cfbd3fb-f7f5-4578-9e24-72dbd185cf12\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxg57" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.476108 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc 
kubenswrapper[4740]: I1009 10:30:00.476168 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/833471e4-0651-45ca-aec1-35c2a8a56b5f-etcd-client\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.476290 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.476405 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/833471e4-0651-45ca-aec1-35c2a8a56b5f-encryption-config\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.477007 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.477034 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.477041 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.477469 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9063b645-eba3-4ba3-a871-23adad70136d-serving-cert\") pod \"route-controller-manager-6576b87f9c-5bn7g\" (UID: \"9063b645-eba3-4ba3-a871-23adad70136d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.477476 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fc8742-b2a1-42a1-b78e-11e736801124-serving-cert\") pod \"controller-manager-879f6c89f-x99pn\" (UID: \"63fc8742-b2a1-42a1-b78e-11e736801124\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.477489 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68bb29c6-1224-44e0-b307-4a2b226288c5-serving-cert\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.477703 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/db1aed22-417f-47ad-a29d-78effc6ac28d-machine-approver-tls\") pod 
\"machine-approver-56656f9798-wscmf\" (UID: \"db1aed22-417f-47ad-a29d-78effc6ac28d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wscmf" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.478342 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/68bb29c6-1224-44e0-b307-4a2b226288c5-etcd-client\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.478642 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bfff965-33fb-4412-85e1-107e0cf34bf8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2tnv4\" (UID: \"6bfff965-33fb-4412-85e1-107e0cf34bf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2tnv4" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.478796 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.479083 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.479389 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.480053 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/68bb29c6-1224-44e0-b307-4a2b226288c5-encryption-config\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.494476 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.512778 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.540396 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.553063 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.557836 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bd8ebc5c-e47c-4177-968f-a3c924dbda0e-etcd-client\") pod \"etcd-operator-b45778765-ssz6d\" (UID: \"bd8ebc5c-e47c-4177-968f-a3c924dbda0e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.557931 4740 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-j7r5q\" (UniqueName: \"kubernetes.io/projected/974f7f71-f43b-4a14-bac3-567229c728c7-kube-api-access-j7r5q\") pod \"openshift-controller-manager-operator-756b6f6bc6-fs6nz\" (UID: \"974f7f71-f43b-4a14-bac3-567229c728c7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fs6nz" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.558018 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/974f7f71-f43b-4a14-bac3-567229c728c7-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-fs6nz\" (UID: \"974f7f71-f43b-4a14-bac3-567229c728c7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fs6nz" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.558057 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/974f7f71-f43b-4a14-bac3-567229c728c7-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-fs6nz\" (UID: \"974f7f71-f43b-4a14-bac3-567229c728c7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fs6nz" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.558124 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bd8ebc5c-e47c-4177-968f-a3c924dbda0e-etcd-ca\") pod \"etcd-operator-b45778765-ssz6d\" (UID: \"bd8ebc5c-e47c-4177-968f-a3c924dbda0e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.558162 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd8ebc5c-e47c-4177-968f-a3c924dbda0e-serving-cert\") pod \"etcd-operator-b45778765-ssz6d\" (UID: \"bd8ebc5c-e47c-4177-968f-a3c924dbda0e\") 
" pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.558227 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd8ebc5c-e47c-4177-968f-a3c924dbda0e-config\") pod \"etcd-operator-b45778765-ssz6d\" (UID: \"bd8ebc5c-e47c-4177-968f-a3c924dbda0e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.558259 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bd8ebc5c-e47c-4177-968f-a3c924dbda0e-etcd-service-ca\") pod \"etcd-operator-b45778765-ssz6d\" (UID: \"bd8ebc5c-e47c-4177-968f-a3c924dbda0e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.558302 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47rtd\" (UniqueName: \"kubernetes.io/projected/bd8ebc5c-e47c-4177-968f-a3c924dbda0e-kube-api-access-47rtd\") pod \"etcd-operator-b45778765-ssz6d\" (UID: \"bd8ebc5c-e47c-4177-968f-a3c924dbda0e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.559991 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/974f7f71-f43b-4a14-bac3-567229c728c7-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-fs6nz\" (UID: \"974f7f71-f43b-4a14-bac3-567229c728c7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fs6nz" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.561358 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/974f7f71-f43b-4a14-bac3-567229c728c7-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-fs6nz\" (UID: \"974f7f71-f43b-4a14-bac3-567229c728c7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fs6nz" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.573160 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.612767 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.619388 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bd8ebc5c-e47c-4177-968f-a3c924dbda0e-etcd-service-ca\") pod \"etcd-operator-b45778765-ssz6d\" (UID: \"bd8ebc5c-e47c-4177-968f-a3c924dbda0e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.653020 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.673097 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.694047 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.699431 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd8ebc5c-e47c-4177-968f-a3c924dbda0e-config\") pod \"etcd-operator-b45778765-ssz6d\" (UID: \"bd8ebc5c-e47c-4177-968f-a3c924dbda0e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.713331 4740 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.721935 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd8ebc5c-e47c-4177-968f-a3c924dbda0e-serving-cert\") pod \"etcd-operator-b45778765-ssz6d\" (UID: \"bd8ebc5c-e47c-4177-968f-a3c924dbda0e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.734104 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.741597 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bd8ebc5c-e47c-4177-968f-a3c924dbda0e-etcd-client\") pod \"etcd-operator-b45778765-ssz6d\" (UID: \"bd8ebc5c-e47c-4177-968f-a3c924dbda0e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.754542 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.774613 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.780093 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bd8ebc5c-e47c-4177-968f-a3c924dbda0e-etcd-ca\") pod \"etcd-operator-b45778765-ssz6d\" (UID: \"bd8ebc5c-e47c-4177-968f-a3c924dbda0e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.794152 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" 
Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.812855 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.833488 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.852920 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.874442 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.893824 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.913357 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.933441 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.953592 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.974084 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 09 10:30:00 crc kubenswrapper[4740]: I1009 10:30:00.994417 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.013962 4740 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.033638 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.054111 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.074860 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.093384 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.113945 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.133821 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.154397 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.174122 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.194149 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.220267 4740 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.232883 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.253452 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.274067 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.293321 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.313962 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.333489 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.353293 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.373702 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.391905 4740 request.go:700] Waited for 1.01853919s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.393774 4740 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"openshift-service-ca.crt" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.414120 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.433048 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.453260 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.473716 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.493687 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.513557 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.533144 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.554453 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.574421 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.594082 4740 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.614190 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.633615 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.653834 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.677161 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.694023 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.713466 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.733387 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.753478 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.773843 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.794356 4740 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.813820 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.833560 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.853179 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.873680 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.893155 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.913554 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.933894 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.953135 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.972686 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 10:30:01 crc kubenswrapper[4740]: I1009 10:30:01.993511 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 
10:30:02.014246 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.033343 4740 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.053425 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.074024 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.093223 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.114035 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.133932 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.154554 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.173900 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.220803 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s478v\" (UniqueName: \"kubernetes.io/projected/6bfff965-33fb-4412-85e1-107e0cf34bf8-kube-api-access-s478v\") pod \"openshift-apiserver-operator-796bbdcf4f-2tnv4\" (UID: \"6bfff965-33fb-4412-85e1-107e0cf34bf8\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2tnv4" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.236412 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58xh4\" (UniqueName: \"kubernetes.io/projected/2cfbd3fb-f7f5-4578-9e24-72dbd185cf12-kube-api-access-58xh4\") pod \"machine-api-operator-5694c8668f-pxg57\" (UID: \"2cfbd3fb-f7f5-4578-9e24-72dbd185cf12\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxg57" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.254886 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nb9c\" (UniqueName: \"kubernetes.io/projected/833471e4-0651-45ca-aec1-35c2a8a56b5f-kube-api-access-5nb9c\") pod \"apiserver-7bbb656c7d-6srf8\" (UID: \"833471e4-0651-45ca-aec1-35c2a8a56b5f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.280786 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkcp8\" (UniqueName: \"kubernetes.io/projected/a23bc8cd-dc20-4ade-88cd-1c61d1f6315f-kube-api-access-bkcp8\") pod \"console-operator-58897d9998-f67b5\" (UID: \"a23bc8cd-dc20-4ade-88cd-1c61d1f6315f\") " pod="openshift-console-operator/console-operator-58897d9998-f67b5" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.294077 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-pxg57" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.297379 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvg5x\" (UniqueName: \"kubernetes.io/projected/db1aed22-417f-47ad-a29d-78effc6ac28d-kube-api-access-kvg5x\") pod \"machine-approver-56656f9798-wscmf\" (UID: \"db1aed22-417f-47ad-a29d-78effc6ac28d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wscmf" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.312074 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4hsd\" (UniqueName: \"kubernetes.io/projected/63fc8742-b2a1-42a1-b78e-11e736801124-kube-api-access-b4hsd\") pod \"controller-manager-879f6c89f-x99pn\" (UID: \"63fc8742-b2a1-42a1-b78e-11e736801124\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.321777 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.326056 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wscmf" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.329163 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlm4d\" (UniqueName: \"kubernetes.io/projected/68bb29c6-1224-44e0-b307-4a2b226288c5-kube-api-access-zlm4d\") pod \"apiserver-76f77b778f-lxzfg\" (UID: \"68bb29c6-1224-44e0-b307-4a2b226288c5\") " pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:02 crc kubenswrapper[4740]: W1009 10:30:02.342962 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb1aed22_417f_47ad_a29d_78effc6ac28d.slice/crio-793fe030e5b165bd84503ac7478afbcf91932dfe669a702b137cf3be36c79e97 WatchSource:0}: Error finding container 793fe030e5b165bd84503ac7478afbcf91932dfe669a702b137cf3be36c79e97: Status 404 returned error can't find the container with id 793fe030e5b165bd84503ac7478afbcf91932dfe669a702b137cf3be36c79e97 Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.348193 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hz6d\" (UniqueName: \"kubernetes.io/projected/9063b645-eba3-4ba3-a871-23adad70136d-kube-api-access-2hz6d\") pod \"route-controller-manager-6576b87f9c-5bn7g\" (UID: \"9063b645-eba3-4ba3-a871-23adad70136d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.350155 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2tnv4" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.369458 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqxth\" (UniqueName: \"kubernetes.io/projected/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-kube-api-access-sqxth\") pod \"oauth-openshift-558db77b4-6zqw2\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.383973 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-f67b5" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.395333 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7r5q\" (UniqueName: \"kubernetes.io/projected/974f7f71-f43b-4a14-bac3-567229c728c7-kube-api-access-j7r5q\") pod \"openshift-controller-manager-operator-756b6f6bc6-fs6nz\" (UID: \"974f7f71-f43b-4a14-bac3-567229c728c7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fs6nz" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.407493 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47rtd\" (UniqueName: \"kubernetes.io/projected/bd8ebc5c-e47c-4177-968f-a3c924dbda0e-kube-api-access-47rtd\") pod \"etcd-operator-b45778765-ssz6d\" (UID: \"bd8ebc5c-e47c-4177-968f-a3c924dbda0e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.411909 4740 request.go:700] Waited for 1.816674425s due to client-side throttling, not priority and fairness, request: PATCH:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-oauth-apiserver/pods/apiserver-7bbb656c7d-6srf8/status Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.445014 4740 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wscmf" event={"ID":"db1aed22-417f-47ad-a29d-78effc6ac28d","Type":"ContainerStarted","Data":"793fe030e5b165bd84503ac7478afbcf91932dfe669a702b137cf3be36c79e97"} Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.484946 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-console-config\") pod \"console-f9d7485db-g68sq\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " pod="openshift-console/console-f9d7485db-g68sq" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.484971 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-oauth-serving-cert\") pod \"console-f9d7485db-g68sq\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " pod="openshift-console/console-f9d7485db-g68sq" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.484990 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/645bfb4f-f372-4d0a-99da-d6942d8b773c-config\") pod \"authentication-operator-69f744f599-lrpmd\" (UID: \"645bfb4f-f372-4d0a-99da-d6942d8b773c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lrpmd" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485030 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd568fb3-5f33-4412-a84b-c37d56678927-config\") pod \"kube-controller-manager-operator-78b949d7b-fbxhv\" (UID: \"cd568fb3-5f33-4412-a84b-c37d56678927\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbxhv" Oct 09 
10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485063 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58ba2ce5-2051-4631-a4dd-3b8bd96759f8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-h4zqk\" (UID: \"58ba2ce5-2051-4631-a4dd-3b8bd96759f8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h4zqk" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485083 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/513aa088-5f0d-479a-9668-e8ae80738297-bound-sa-token\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485099 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/626a56dc-ba4f-4ff6-a787-8f60403b4d42-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pjdt4\" (UID: \"626a56dc-ba4f-4ff6-a787-8f60403b4d42\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pjdt4" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485122 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0daf28c1-6a40-4a53-a196-521d95be9aab-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6wv7f\" (UID: \"0daf28c1-6a40-4a53-a196-521d95be9aab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6wv7f" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485143 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/513aa088-5f0d-479a-9668-e8ae80738297-registry-certificates\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485160 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29e2bc86-ce7e-4abd-93d7-7adf15987e18-serving-cert\") pod \"openshift-config-operator-7777fb866f-zlt7s\" (UID: \"29e2bc86-ce7e-4abd-93d7-7adf15987e18\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zlt7s" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485176 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btf9s\" (UniqueName: \"kubernetes.io/projected/645bfb4f-f372-4d0a-99da-d6942d8b773c-kube-api-access-btf9s\") pod \"authentication-operator-69f744f599-lrpmd\" (UID: \"645bfb4f-f372-4d0a-99da-d6942d8b773c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lrpmd" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485212 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0daf28c1-6a40-4a53-a196-521d95be9aab-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6wv7f\" (UID: \"0daf28c1-6a40-4a53-a196-521d95be9aab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6wv7f" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485237 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4xkk\" (UniqueName: \"kubernetes.io/projected/513aa088-5f0d-479a-9668-e8ae80738297-kube-api-access-t4xkk\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: 
\"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485254 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58ba2ce5-2051-4631-a4dd-3b8bd96759f8-metrics-tls\") pod \"ingress-operator-5b745b69d9-h4zqk\" (UID: \"58ba2ce5-2051-4631-a4dd-3b8bd96759f8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h4zqk" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485268 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft7tr\" (UniqueName: \"kubernetes.io/projected/58ba2ce5-2051-4631-a4dd-3b8bd96759f8-kube-api-access-ft7tr\") pod \"ingress-operator-5b745b69d9-h4zqk\" (UID: \"58ba2ce5-2051-4631-a4dd-3b8bd96759f8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h4zqk" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485282 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/645bfb4f-f372-4d0a-99da-d6942d8b773c-serving-cert\") pod \"authentication-operator-69f744f599-lrpmd\" (UID: \"645bfb4f-f372-4d0a-99da-d6942d8b773c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lrpmd" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485306 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/513aa088-5f0d-479a-9668-e8ae80738297-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485322 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-service-ca\") pod \"console-f9d7485db-g68sq\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " pod="openshift-console/console-f9d7485db-g68sq" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485366 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/513aa088-5f0d-479a-9668-e8ae80738297-trusted-ca\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485383 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-console-serving-cert\") pod \"console-f9d7485db-g68sq\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " pod="openshift-console/console-f9d7485db-g68sq" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485415 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/29e2bc86-ce7e-4abd-93d7-7adf15987e18-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zlt7s\" (UID: \"29e2bc86-ce7e-4abd-93d7-7adf15987e18\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zlt7s" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485432 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58ba2ce5-2051-4631-a4dd-3b8bd96759f8-trusted-ca\") pod \"ingress-operator-5b745b69d9-h4zqk\" (UID: \"58ba2ce5-2051-4631-a4dd-3b8bd96759f8\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h4zqk" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485461 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/626a56dc-ba4f-4ff6-a787-8f60403b4d42-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pjdt4\" (UID: \"626a56dc-ba4f-4ff6-a787-8f60403b4d42\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pjdt4" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485478 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxqx9\" (UniqueName: \"kubernetes.io/projected/29e2bc86-ce7e-4abd-93d7-7adf15987e18-kube-api-access-gxqx9\") pod \"openshift-config-operator-7777fb866f-zlt7s\" (UID: \"29e2bc86-ce7e-4abd-93d7-7adf15987e18\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zlt7s" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485494 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/513aa088-5f0d-479a-9668-e8ae80738297-registry-tls\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485509 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/626a56dc-ba4f-4ff6-a787-8f60403b4d42-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pjdt4\" (UID: \"626a56dc-ba4f-4ff6-a787-8f60403b4d42\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pjdt4" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485531 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485548 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-trusted-ca-bundle\") pod \"console-f9d7485db-g68sq\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " pod="openshift-console/console-f9d7485db-g68sq" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485566 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt9nw\" (UniqueName: \"kubernetes.io/projected/f2de169d-9583-46e5-b2ee-da1a6903eafb-kube-api-access-pt9nw\") pod \"downloads-7954f5f757-qzp8b\" (UID: \"f2de169d-9583-46e5-b2ee-da1a6903eafb\") " pod="openshift-console/downloads-7954f5f757-qzp8b" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485590 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v8nt\" (UniqueName: \"kubernetes.io/projected/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-kube-api-access-8v8nt\") pod \"console-f9d7485db-g68sq\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " pod="openshift-console/console-f9d7485db-g68sq" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485614 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd568fb3-5f33-4412-a84b-c37d56678927-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fbxhv\" (UID: \"cd568fb3-5f33-4412-a84b-c37d56678927\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbxhv" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485629 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd568fb3-5f33-4412-a84b-c37d56678927-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fbxhv\" (UID: \"cd568fb3-5f33-4412-a84b-c37d56678927\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbxhv" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485646 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0daf28c1-6a40-4a53-a196-521d95be9aab-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6wv7f\" (UID: \"0daf28c1-6a40-4a53-a196-521d95be9aab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6wv7f" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485664 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/645bfb4f-f372-4d0a-99da-d6942d8b773c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lrpmd\" (UID: \"645bfb4f-f372-4d0a-99da-d6942d8b773c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lrpmd" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485682 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c338011-b98d-4a6b-b48e-76025b1f0973-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pvppr\" (UID: \"6c338011-b98d-4a6b-b48e-76025b1f0973\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvppr" Oct 09 10:30:02 crc 
kubenswrapper[4740]: I1009 10:30:02.485698 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/645bfb4f-f372-4d0a-99da-d6942d8b773c-service-ca-bundle\") pod \"authentication-operator-69f744f599-lrpmd\" (UID: \"645bfb4f-f372-4d0a-99da-d6942d8b773c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lrpmd" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485715 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k89jq\" (UniqueName: \"kubernetes.io/projected/6c338011-b98d-4a6b-b48e-76025b1f0973-kube-api-access-k89jq\") pod \"cluster-samples-operator-665b6dd947-pvppr\" (UID: \"6c338011-b98d-4a6b-b48e-76025b1f0973\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvppr" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485792 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/513aa088-5f0d-479a-9668-e8ae80738297-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485827 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9tbx\" (UniqueName: \"kubernetes.io/projected/626a56dc-ba4f-4ff6-a787-8f60403b4d42-kube-api-access-d9tbx\") pod \"cluster-image-registry-operator-dc59b4c8b-pjdt4\" (UID: \"626a56dc-ba4f-4ff6-a787-8f60403b4d42\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pjdt4" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.485898 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-console-oauth-config\") pod \"console-f9d7485db-g68sq\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " pod="openshift-console/console-f9d7485db-g68sq" Oct 09 10:30:02 crc kubenswrapper[4740]: E1009 10:30:02.486578 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:02.986566337 +0000 UTC m=+141.948766718 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.500625 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pxg57"] Oct 09 10:30:02 crc kubenswrapper[4740]: W1009 10:30:02.511201 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cfbd3fb_f7f5_4578_9e24_72dbd185cf12.slice/crio-99894b4c04f408a879a7bf9e4ec9863404beadc8b1d4981bbb5abccc46256ea2 WatchSource:0}: Error finding container 99894b4c04f408a879a7bf9e4ec9863404beadc8b1d4981bbb5abccc46256ea2: Status 404 returned error can't find the container with id 99894b4c04f408a879a7bf9e4ec9863404beadc8b1d4981bbb5abccc46256ea2 Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.524406 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fs6nz" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.544377 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.553618 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.556314 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.562484 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.566510 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8"] Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.586484 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:02 crc kubenswrapper[4740]: E1009 10:30:02.586673 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:03.086651433 +0000 UTC m=+142.048851814 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.586726 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29e2bc86-ce7e-4abd-93d7-7adf15987e18-serving-cert\") pod \"openshift-config-operator-7777fb866f-zlt7s\" (UID: \"29e2bc86-ce7e-4abd-93d7-7adf15987e18\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zlt7s" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.586780 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btf9s\" (UniqueName: \"kubernetes.io/projected/645bfb4f-f372-4d0a-99da-d6942d8b773c-kube-api-access-btf9s\") pod \"authentication-operator-69f744f599-lrpmd\" (UID: \"645bfb4f-f372-4d0a-99da-d6942d8b773c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lrpmd" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.586810 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58ba2ce5-2051-4631-a4dd-3b8bd96759f8-metrics-tls\") pod \"ingress-operator-5b745b69d9-h4zqk\" (UID: \"58ba2ce5-2051-4631-a4dd-3b8bd96759f8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h4zqk" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.586832 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/645bfb4f-f372-4d0a-99da-d6942d8b773c-serving-cert\") pod 
\"authentication-operator-69f744f599-lrpmd\" (UID: \"645bfb4f-f372-4d0a-99da-d6942d8b773c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lrpmd" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.586859 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/494881f5-dbba-48c3-9871-c8d81136eda3-registration-dir\") pod \"csi-hostpathplugin-95p7x\" (UID: \"494881f5-dbba-48c3-9871-c8d81136eda3\") " pod="hostpath-provisioner/csi-hostpathplugin-95p7x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.586895 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1616614d-03d7-42ee-913f-711b77d1032f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-w956x\" (UID: \"1616614d-03d7-42ee-913f-711b77d1032f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w956x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.586918 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8d56056-c06c-441c-8936-0416f53f5da0-cert\") pod \"ingress-canary-c78zk\" (UID: \"b8d56056-c06c-441c-8936-0416f53f5da0\") " pod="openshift-ingress-canary/ingress-canary-c78zk" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.586939 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d7e13929-564e-49dd-baab-987ea26c55a3-certs\") pod \"machine-config-server-jrhp4\" (UID: \"d7e13929-564e-49dd-baab-987ea26c55a3\") " pod="openshift-machine-config-operator/machine-config-server-jrhp4" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.586960 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58ba2ce5-2051-4631-a4dd-3b8bd96759f8-trusted-ca\") pod \"ingress-operator-5b745b69d9-h4zqk\" (UID: \"58ba2ce5-2051-4631-a4dd-3b8bd96759f8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h4zqk" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.586980 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6brpw\" (UniqueName: \"kubernetes.io/projected/1616614d-03d7-42ee-913f-711b77d1032f-kube-api-access-6brpw\") pod \"package-server-manager-789f6589d5-w956x\" (UID: \"1616614d-03d7-42ee-913f-711b77d1032f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w956x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587001 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fd6215a-2d0b-48c5-be33-130bb55803c7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-flsgk\" (UID: \"2fd6215a-2d0b-48c5-be33-130bb55803c7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flsgk" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587020 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fd6215a-2d0b-48c5-be33-130bb55803c7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-flsgk\" (UID: \"2fd6215a-2d0b-48c5-be33-130bb55803c7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flsgk" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587044 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/626a56dc-ba4f-4ff6-a787-8f60403b4d42-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pjdt4\" (UID: \"626a56dc-ba4f-4ff6-a787-8f60403b4d42\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pjdt4" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587064 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/494881f5-dbba-48c3-9871-c8d81136eda3-socket-dir\") pod \"csi-hostpathplugin-95p7x\" (UID: \"494881f5-dbba-48c3-9871-c8d81136eda3\") " pod="hostpath-provisioner/csi-hostpathplugin-95p7x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587133 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/626a56dc-ba4f-4ff6-a787-8f60403b4d42-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pjdt4\" (UID: \"626a56dc-ba4f-4ff6-a787-8f60403b4d42\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pjdt4" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587161 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxqx9\" (UniqueName: \"kubernetes.io/projected/29e2bc86-ce7e-4abd-93d7-7adf15987e18-kube-api-access-gxqx9\") pod \"openshift-config-operator-7777fb866f-zlt7s\" (UID: \"29e2bc86-ce7e-4abd-93d7-7adf15987e18\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zlt7s" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587184 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/494881f5-dbba-48c3-9871-c8d81136eda3-plugins-dir\") pod \"csi-hostpathplugin-95p7x\" (UID: \"494881f5-dbba-48c3-9871-c8d81136eda3\") " pod="hostpath-provisioner/csi-hostpathplugin-95p7x" Oct 09 10:30:02 crc 
kubenswrapper[4740]: I1009 10:30:02.587208 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/513aa088-5f0d-479a-9668-e8ae80738297-registry-tls\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587247 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587270 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-trusted-ca-bundle\") pod \"console-f9d7485db-g68sq\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " pod="openshift-console/console-f9d7485db-g68sq"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587294 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7d9654ef-644a-4274-b02b-c8eaf9d53a96-srv-cert\") pod \"catalog-operator-68c6474976-hstzh\" (UID: \"7d9654ef-644a-4274-b02b-c8eaf9d53a96\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hstzh"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587318 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4411b16-07f8-4701-ad4f-7645a00e829f-config-volume\") pod \"collect-profiles-29333430-lmrlf\" (UID: \"e4411b16-07f8-4701-ad4f-7645a00e829f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587341 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d53b55b-dbd8-420d-bc00-128f7d5e1580-serving-cert\") pod \"service-ca-operator-777779d784-cm7ws\" (UID: \"4d53b55b-dbd8-420d-bc00-128f7d5e1580\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cm7ws"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587365 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v8nt\" (UniqueName: \"kubernetes.io/projected/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-kube-api-access-8v8nt\") pod \"console-f9d7485db-g68sq\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " pod="openshift-console/console-f9d7485db-g68sq"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587388 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt9nw\" (UniqueName: \"kubernetes.io/projected/f2de169d-9583-46e5-b2ee-da1a6903eafb-kube-api-access-pt9nw\") pod \"downloads-7954f5f757-qzp8b\" (UID: \"f2de169d-9583-46e5-b2ee-da1a6903eafb\") " pod="openshift-console/downloads-7954f5f757-qzp8b"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587412 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c1705605-4391-45da-a171-23f5a7e0ff74-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kp59b\" (UID: \"c1705605-4391-45da-a171-23f5a7e0ff74\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kp59b"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587438 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0daf28c1-6a40-4a53-a196-521d95be9aab-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6wv7f\" (UID: \"0daf28c1-6a40-4a53-a196-521d95be9aab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6wv7f"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587486 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8df53cdb-6c1c-41c2-8b24-5b73c400ced4-tmpfs\") pod \"packageserver-d55dfcdfc-kgt6j\" (UID: \"8df53cdb-6c1c-41c2-8b24-5b73c400ced4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587509 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fbgt\" (UniqueName: \"kubernetes.io/projected/7b04fa40-8401-462e-8fc1-c55dbca89bbc-kube-api-access-6fbgt\") pod \"service-ca-9c57cc56f-842d9\" (UID: \"7b04fa40-8401-462e-8fc1-c55dbca89bbc\") " pod="openshift-service-ca/service-ca-9c57cc56f-842d9"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587533 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n74sw\" (UniqueName: \"kubernetes.io/projected/b2af699f-f757-4c89-ba00-55f9f8599fda-kube-api-access-n74sw\") pod \"machine-config-operator-74547568cd-t6fqj\" (UID: \"b2af699f-f757-4c89-ba00-55f9f8599fda\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6fqj"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587558 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7b04fa40-8401-462e-8fc1-c55dbca89bbc-signing-cabundle\") pod \"service-ca-9c57cc56f-842d9\" (UID: \"7b04fa40-8401-462e-8fc1-c55dbca89bbc\") " pod="openshift-service-ca/service-ca-9c57cc56f-842d9"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587581 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/645bfb4f-f372-4d0a-99da-d6942d8b773c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lrpmd\" (UID: \"645bfb4f-f372-4d0a-99da-d6942d8b773c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lrpmd"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587605 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc2dbb2e-c3ba-4ed2-becc-a6e809fe46a5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-598nr\" (UID: \"dc2dbb2e-c3ba-4ed2-becc-a6e809fe46a5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-598nr"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587628 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/93d94603-9462-4c78-9a9f-ee66522eb4cf-metrics-tls\") pod \"dns-operator-744455d44c-g6tmc\" (UID: \"93d94603-9462-4c78-9a9f-ee66522eb4cf\") " pod="openshift-dns-operator/dns-operator-744455d44c-g6tmc"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587654 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/645bfb4f-f372-4d0a-99da-d6942d8b773c-service-ca-bundle\") pod \"authentication-operator-69f744f599-lrpmd\" (UID: \"645bfb4f-f372-4d0a-99da-d6942d8b773c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lrpmd"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587678 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/513aa088-5f0d-479a-9668-e8ae80738297-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587701 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b2af699f-f757-4c89-ba00-55f9f8599fda-proxy-tls\") pod \"machine-config-operator-74547568cd-t6fqj\" (UID: \"b2af699f-f757-4c89-ba00-55f9f8599fda\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6fqj"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.587739 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9tbx\" (UniqueName: \"kubernetes.io/projected/626a56dc-ba4f-4ff6-a787-8f60403b4d42-kube-api-access-d9tbx\") pod \"cluster-image-registry-operator-dc59b4c8b-pjdt4\" (UID: \"626a56dc-ba4f-4ff6-a787-8f60403b4d42\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pjdt4"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588121 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7b04fa40-8401-462e-8fc1-c55dbca89bbc-signing-key\") pod \"service-ca-9c57cc56f-842d9\" (UID: \"7b04fa40-8401-462e-8fc1-c55dbca89bbc\") " pod="openshift-service-ca/service-ca-9c57cc56f-842d9"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588148 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c9c38db8-21e3-495b-b6db-3ea52bec9b5c-default-certificate\") pod \"router-default-5444994796-vntt9\" (UID: \"c9c38db8-21e3-495b-b6db-3ea52bec9b5c\") " pod="openshift-ingress/router-default-5444994796-vntt9"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588171 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4c0cc7fa-dbfa-417d-bdcd-eb9dfb10a67d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-828hm\" (UID: \"4c0cc7fa-dbfa-417d-bdcd-eb9dfb10a67d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-828hm"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588192 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d7e13929-564e-49dd-baab-987ea26c55a3-node-bootstrap-token\") pod \"machine-config-server-jrhp4\" (UID: \"d7e13929-564e-49dd-baab-987ea26c55a3\") " pod="openshift-machine-config-operator/machine-config-server-jrhp4"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588227 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e-config-volume\") pod \"dns-default-xhb6x\" (UID: \"b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e\") " pod="openshift-dns/dns-default-xhb6x"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588253 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07b58f91-881e-4c94-96b6-ff6126e39824-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zxkqd\" (UID: \"07b58f91-881e-4c94-96b6-ff6126e39824\") " pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588279 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-console-config\") pod \"console-f9d7485db-g68sq\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " pod="openshift-console/console-f9d7485db-g68sq"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588302 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-oauth-serving-cert\") pod \"console-f9d7485db-g68sq\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " pod="openshift-console/console-f9d7485db-g68sq"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588324 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnvp9\" (UniqueName: \"kubernetes.io/projected/c9c38db8-21e3-495b-b6db-3ea52bec9b5c-kube-api-access-fnvp9\") pod \"router-default-5444994796-vntt9\" (UID: \"c9c38db8-21e3-495b-b6db-3ea52bec9b5c\") " pod="openshift-ingress/router-default-5444994796-vntt9"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588361 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dsw4\" (UniqueName: \"kubernetes.io/projected/c1705605-4391-45da-a171-23f5a7e0ff74-kube-api-access-5dsw4\") pod \"multus-admission-controller-857f4d67dd-kp59b\" (UID: \"c1705605-4391-45da-a171-23f5a7e0ff74\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kp59b"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588397 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8ww7\" (UniqueName: \"kubernetes.io/projected/e4411b16-07f8-4701-ad4f-7645a00e829f-kube-api-access-j8ww7\") pod \"collect-profiles-29333430-lmrlf\" (UID: \"e4411b16-07f8-4701-ad4f-7645a00e829f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588420 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4411b16-07f8-4701-ad4f-7645a00e829f-secret-volume\") pod \"collect-profiles-29333430-lmrlf\" (UID: \"e4411b16-07f8-4701-ad4f-7645a00e829f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588445 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/513aa088-5f0d-479a-9668-e8ae80738297-bound-sa-token\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588469 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8df53cdb-6c1c-41c2-8b24-5b73c400ced4-webhook-cert\") pod \"packageserver-d55dfcdfc-kgt6j\" (UID: \"8df53cdb-6c1c-41c2-8b24-5b73c400ced4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588491 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfxtj\" (UniqueName: \"kubernetes.io/projected/8df53cdb-6c1c-41c2-8b24-5b73c400ced4-kube-api-access-jfxtj\") pod \"packageserver-d55dfcdfc-kgt6j\" (UID: \"8df53cdb-6c1c-41c2-8b24-5b73c400ced4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588514 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7d9654ef-644a-4274-b02b-c8eaf9d53a96-profile-collector-cert\") pod \"catalog-operator-68c6474976-hstzh\" (UID: \"7d9654ef-644a-4274-b02b-c8eaf9d53a96\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hstzh"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588552 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0daf28c1-6a40-4a53-a196-521d95be9aab-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6wv7f\" (UID: \"0daf28c1-6a40-4a53-a196-521d95be9aab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6wv7f"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588573 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc2dbb2e-c3ba-4ed2-becc-a6e809fe46a5-config\") pod \"kube-apiserver-operator-766d6c64bb-598nr\" (UID: \"dc2dbb2e-c3ba-4ed2-becc-a6e809fe46a5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-598nr"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588594 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c0cc7fa-dbfa-417d-bdcd-eb9dfb10a67d-proxy-tls\") pod \"machine-config-controller-84d6567774-828hm\" (UID: \"4c0cc7fa-dbfa-417d-bdcd-eb9dfb10a67d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-828hm"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588622 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft7tr\" (UniqueName: \"kubernetes.io/projected/58ba2ce5-2051-4631-a4dd-3b8bd96759f8-kube-api-access-ft7tr\") pod \"ingress-operator-5b745b69d9-h4zqk\" (UID: \"58ba2ce5-2051-4631-a4dd-3b8bd96759f8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h4zqk"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588645 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58ba2ce5-2051-4631-a4dd-3b8bd96759f8-trusted-ca\") pod \"ingress-operator-5b745b69d9-h4zqk\" (UID: \"58ba2ce5-2051-4631-a4dd-3b8bd96759f8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h4zqk"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588648 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4xkk\" (UniqueName: \"kubernetes.io/projected/513aa088-5f0d-479a-9668-e8ae80738297-kube-api-access-t4xkk\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588697 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c9c38db8-21e3-495b-b6db-3ea52bec9b5c-stats-auth\") pod \"router-default-5444994796-vntt9\" (UID: \"c9c38db8-21e3-495b-b6db-3ea52bec9b5c\") " pod="openshift-ingress/router-default-5444994796-vntt9"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588720 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/494881f5-dbba-48c3-9871-c8d81136eda3-mountpoint-dir\") pod \"csi-hostpathplugin-95p7x\" (UID: \"494881f5-dbba-48c3-9871-c8d81136eda3\") " pod="hostpath-provisioner/csi-hostpathplugin-95p7x"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588773 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/513aa088-5f0d-479a-9668-e8ae80738297-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588797 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-service-ca\") pod \"console-f9d7485db-g68sq\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " pod="openshift-console/console-f9d7485db-g68sq"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588819 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/513aa088-5f0d-479a-9668-e8ae80738297-trusted-ca\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588840 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-console-serving-cert\") pod \"console-f9d7485db-g68sq\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " pod="openshift-console/console-f9d7485db-g68sq"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588878 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/494881f5-dbba-48c3-9871-c8d81136eda3-csi-data-dir\") pod \"csi-hostpathplugin-95p7x\" (UID: \"494881f5-dbba-48c3-9871-c8d81136eda3\") " pod="hostpath-provisioner/csi-hostpathplugin-95p7x"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588900 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b2af699f-f757-4c89-ba00-55f9f8599fda-auth-proxy-config\") pod \"machine-config-operator-74547568cd-t6fqj\" (UID: \"b2af699f-f757-4c89-ba00-55f9f8599fda\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6fqj"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588927 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/29e2bc86-ce7e-4abd-93d7-7adf15987e18-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zlt7s\" (UID: \"29e2bc86-ce7e-4abd-93d7-7adf15987e18\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zlt7s"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588948 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8df53cdb-6c1c-41c2-8b24-5b73c400ced4-apiservice-cert\") pod \"packageserver-d55dfcdfc-kgt6j\" (UID: \"8df53cdb-6c1c-41c2-8b24-5b73c400ced4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.588984 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ts8p\" (UniqueName: \"kubernetes.io/projected/4d53b55b-dbd8-420d-bc00-128f7d5e1580-kube-api-access-4ts8p\") pod \"service-ca-operator-777779d784-cm7ws\" (UID: \"4d53b55b-dbd8-420d-bc00-128f7d5e1580\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cm7ws"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589010 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d53b55b-dbd8-420d-bc00-128f7d5e1580-config\") pod \"service-ca-operator-777779d784-cm7ws\" (UID: \"4d53b55b-dbd8-420d-bc00-128f7d5e1580\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cm7ws"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589034 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9c38db8-21e3-495b-b6db-3ea52bec9b5c-metrics-certs\") pod \"router-default-5444994796-vntt9\" (UID: \"c9c38db8-21e3-495b-b6db-3ea52bec9b5c\") " pod="openshift-ingress/router-default-5444994796-vntt9"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589070 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd568fb3-5f33-4412-a84b-c37d56678927-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fbxhv\" (UID: \"cd568fb3-5f33-4412-a84b-c37d56678927\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbxhv"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589091 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd568fb3-5f33-4412-a84b-c37d56678927-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fbxhv\" (UID: \"cd568fb3-5f33-4412-a84b-c37d56678927\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbxhv"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589111 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e-metrics-tls\") pod \"dns-default-xhb6x\" (UID: \"b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e\") " pod="openshift-dns/dns-default-xhb6x"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589133 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9c38db8-21e3-495b-b6db-3ea52bec9b5c-service-ca-bundle\") pod \"router-default-5444994796-vntt9\" (UID: \"c9c38db8-21e3-495b-b6db-3ea52bec9b5c\") " pod="openshift-ingress/router-default-5444994796-vntt9"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589165 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b2af699f-f757-4c89-ba00-55f9f8599fda-images\") pod \"machine-config-operator-74547568cd-t6fqj\" (UID: \"b2af699f-f757-4c89-ba00-55f9f8599fda\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6fqj"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589193 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c338011-b98d-4a6b-b48e-76025b1f0973-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pvppr\" (UID: \"6c338011-b98d-4a6b-b48e-76025b1f0973\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvppr"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589219 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7bw8\" (UniqueName: \"kubernetes.io/projected/d7e13929-564e-49dd-baab-987ea26c55a3-kube-api-access-d7bw8\") pod \"machine-config-server-jrhp4\" (UID: \"d7e13929-564e-49dd-baab-987ea26c55a3\") " pod="openshift-machine-config-operator/machine-config-server-jrhp4"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589246 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k89jq\" (UniqueName: \"kubernetes.io/projected/6c338011-b98d-4a6b-b48e-76025b1f0973-kube-api-access-k89jq\") pod \"cluster-samples-operator-665b6dd947-pvppr\" (UID: \"6c338011-b98d-4a6b-b48e-76025b1f0973\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvppr"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589302 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a1412af2-816f-44a8-9862-b1a86ea6b9bc-srv-cert\") pod \"olm-operator-6b444d44fb-959ns\" (UID: \"a1412af2-816f-44a8-9862-b1a86ea6b9bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-959ns"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589343 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h2wg\" (UniqueName: \"kubernetes.io/projected/9ab9ab18-43b3-4138-a7a9-ee90d3abe1e1-kube-api-access-2h2wg\") pod \"migrator-59844c95c7-qwphn\" (UID: \"9ab9ab18-43b3-4138-a7a9-ee90d3abe1e1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qwphn"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589364 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb7sf\" (UniqueName: \"kubernetes.io/projected/91bc0d62-1ab0-4ca6-ad8d-ab99a4eea54b-kube-api-access-sb7sf\") pod \"control-plane-machine-set-operator-78cbb6b69f-mr7wc\" (UID: \"91bc0d62-1ab0-4ca6-ad8d-ab99a4eea54b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mr7wc"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589402 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-console-oauth-config\") pod \"console-f9d7485db-g68sq\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " pod="openshift-console/console-f9d7485db-g68sq"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589424 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdlc9\" (UniqueName: \"kubernetes.io/projected/93d94603-9462-4c78-9a9f-ee66522eb4cf-kube-api-access-hdlc9\") pod \"dns-operator-744455d44c-g6tmc\" (UID: \"93d94603-9462-4c78-9a9f-ee66522eb4cf\") " pod="openshift-dns-operator/dns-operator-744455d44c-g6tmc"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589442 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljsbm\" (UniqueName: \"kubernetes.io/projected/7d9654ef-644a-4274-b02b-c8eaf9d53a96-kube-api-access-ljsbm\") pod \"catalog-operator-68c6474976-hstzh\" (UID: \"7d9654ef-644a-4274-b02b-c8eaf9d53a96\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hstzh"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589461 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54zft\" (UniqueName: \"kubernetes.io/projected/07b58f91-881e-4c94-96b6-ff6126e39824-kube-api-access-54zft\") pod \"marketplace-operator-79b997595-zxkqd\" (UID: \"07b58f91-881e-4c94-96b6-ff6126e39824\") " pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589494 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/07b58f91-881e-4c94-96b6-ff6126e39824-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zxkqd\" (UID: \"07b58f91-881e-4c94-96b6-ff6126e39824\") " pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589515 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54rjl\" (UniqueName: \"kubernetes.io/projected/494881f5-dbba-48c3-9871-c8d81136eda3-kube-api-access-54rjl\") pod \"csi-hostpathplugin-95p7x\" (UID: \"494881f5-dbba-48c3-9871-c8d81136eda3\") " pod="hostpath-provisioner/csi-hostpathplugin-95p7x"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589538 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd568fb3-5f33-4412-a84b-c37d56678927-config\") pod \"kube-controller-manager-operator-78b949d7b-fbxhv\" (UID: \"cd568fb3-5f33-4412-a84b-c37d56678927\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbxhv"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589558 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/645bfb4f-f372-4d0a-99da-d6942d8b773c-config\") pod \"authentication-operator-69f744f599-lrpmd\" (UID: \"645bfb4f-f372-4d0a-99da-d6942d8b773c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lrpmd"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589578 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24hmr\" (UniqueName: \"kubernetes.io/projected/b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e-kube-api-access-24hmr\") pod \"dns-default-xhb6x\" (UID: \"b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e\") " pod="openshift-dns/dns-default-xhb6x"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589597 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a1412af2-816f-44a8-9862-b1a86ea6b9bc-profile-collector-cert\") pod \"olm-operator-6b444d44fb-959ns\" (UID: \"a1412af2-816f-44a8-9862-b1a86ea6b9bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-959ns"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589619 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc2dbb2e-c3ba-4ed2-becc-a6e809fe46a5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-598nr\" (UID: \"dc2dbb2e-c3ba-4ed2-becc-a6e809fe46a5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-598nr"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589642 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/91bc0d62-1ab0-4ca6-ad8d-ab99a4eea54b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mr7wc\" (UID: \"91bc0d62-1ab0-4ca6-ad8d-ab99a4eea54b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mr7wc"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589666 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58ba2ce5-2051-4631-a4dd-3b8bd96759f8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-h4zqk\" (UID: \"58ba2ce5-2051-4631-a4dd-3b8bd96759f8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h4zqk"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589686 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s9vv\" (UniqueName: \"kubernetes.io/projected/a1412af2-816f-44a8-9862-b1a86ea6b9bc-kube-api-access-6s9vv\") pod \"olm-operator-6b444d44fb-959ns\" (UID: \"a1412af2-816f-44a8-9862-b1a86ea6b9bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-959ns"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589709 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmwhj\" (UniqueName: \"kubernetes.io/projected/2fd6215a-2d0b-48c5-be33-130bb55803c7-kube-api-access-gmwhj\") pod \"kube-storage-version-migrator-operator-b67b599dd-flsgk\" (UID: \"2fd6215a-2d0b-48c5-be33-130bb55803c7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flsgk"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589733 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4rqb\" (UniqueName: \"kubernetes.io/projected/b8d56056-c06c-441c-8936-0416f53f5da0-kube-api-access-s4rqb\") pod \"ingress-canary-c78zk\" (UID: \"b8d56056-c06c-441c-8936-0416f53f5da0\") " pod="openshift-ingress-canary/ingress-canary-c78zk"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589774 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/626a56dc-ba4f-4ff6-a787-8f60403b4d42-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pjdt4\" (UID: \"626a56dc-ba4f-4ff6-a787-8f60403b4d42\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pjdt4"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589798 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0daf28c1-6a40-4a53-a196-521d95be9aab-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6wv7f\" (UID: \"0daf28c1-6a40-4a53-a196-521d95be9aab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6wv7f"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589833 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twbl9\" (UniqueName: \"kubernetes.io/projected/4c0cc7fa-dbfa-417d-bdcd-eb9dfb10a67d-kube-api-access-twbl9\") pod \"machine-config-controller-84d6567774-828hm\" (UID: \"4c0cc7fa-dbfa-417d-bdcd-eb9dfb10a67d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-828hm"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.589858 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/513aa088-5f0d-479a-9668-e8ae80738297-registry-certificates\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.591081 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/513aa088-5f0d-479a-9668-e8ae80738297-registry-certificates\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.591283 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29e2bc86-ce7e-4abd-93d7-7adf15987e18-serving-cert\") pod \"openshift-config-operator-7777fb866f-zlt7s\" (UID: \"29e2bc86-ce7e-4abd-93d7-7adf15987e18\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zlt7s"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.591415 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/513aa088-5f0d-479a-9668-e8ae80738297-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.591516 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58ba2ce5-2051-4631-a4dd-3b8bd96759f8-metrics-tls\") pod \"ingress-operator-5b745b69d9-h4zqk\" (UID: \"58ba2ce5-2051-4631-a4dd-3b8bd96759f8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h4zqk"
Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.592345 4740 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/645bfb4f-f372-4d0a-99da-d6942d8b773c-config\") pod \"authentication-operator-69f744f599-lrpmd\" (UID: \"645bfb4f-f372-4d0a-99da-d6942d8b773c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lrpmd" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.592546 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-service-ca\") pod \"console-f9d7485db-g68sq\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " pod="openshift-console/console-f9d7485db-g68sq" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.592738 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/645bfb4f-f372-4d0a-99da-d6942d8b773c-serving-cert\") pod \"authentication-operator-69f744f599-lrpmd\" (UID: \"645bfb4f-f372-4d0a-99da-d6942d8b773c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lrpmd" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.593320 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/513aa088-5f0d-479a-9668-e8ae80738297-registry-tls\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.595399 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/513aa088-5f0d-479a-9668-e8ae80738297-trusted-ca\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.595625 4740 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/626a56dc-ba4f-4ff6-a787-8f60403b4d42-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pjdt4\" (UID: \"626a56dc-ba4f-4ff6-a787-8f60403b4d42\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pjdt4" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.595679 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/626a56dc-ba4f-4ff6-a787-8f60403b4d42-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pjdt4\" (UID: \"626a56dc-ba4f-4ff6-a787-8f60403b4d42\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pjdt4" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.595834 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-console-oauth-config\") pod \"console-f9d7485db-g68sq\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " pod="openshift-console/console-f9d7485db-g68sq" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.596268 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0daf28c1-6a40-4a53-a196-521d95be9aab-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6wv7f\" (UID: \"0daf28c1-6a40-4a53-a196-521d95be9aab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6wv7f" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.596479 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd568fb3-5f33-4412-a84b-c37d56678927-config\") pod \"kube-controller-manager-operator-78b949d7b-fbxhv\" (UID: \"cd568fb3-5f33-4412-a84b-c37d56678927\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbxhv" Oct 09 10:30:02 crc kubenswrapper[4740]: E1009 10:30:02.596558 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:03.096544684 +0000 UTC m=+142.058745065 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.596718 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/29e2bc86-ce7e-4abd-93d7-7adf15987e18-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zlt7s\" (UID: \"29e2bc86-ce7e-4abd-93d7-7adf15987e18\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zlt7s" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.597352 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-trusted-ca-bundle\") pod \"console-f9d7485db-g68sq\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " pod="openshift-console/console-f9d7485db-g68sq" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.598004 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-console-serving-cert\") 
pod \"console-f9d7485db-g68sq\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " pod="openshift-console/console-f9d7485db-g68sq" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.598075 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/645bfb4f-f372-4d0a-99da-d6942d8b773c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lrpmd\" (UID: \"645bfb4f-f372-4d0a-99da-d6942d8b773c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lrpmd" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.598552 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-console-config\") pod \"console-f9d7485db-g68sq\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " pod="openshift-console/console-f9d7485db-g68sq" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.598642 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/645bfb4f-f372-4d0a-99da-d6942d8b773c-service-ca-bundle\") pod \"authentication-operator-69f744f599-lrpmd\" (UID: \"645bfb4f-f372-4d0a-99da-d6942d8b773c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lrpmd" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.599358 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-oauth-serving-cert\") pod \"console-f9d7485db-g68sq\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " pod="openshift-console/console-f9d7485db-g68sq" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.599572 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0daf28c1-6a40-4a53-a196-521d95be9aab-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6wv7f\" (UID: \"0daf28c1-6a40-4a53-a196-521d95be9aab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6wv7f" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.601527 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c338011-b98d-4a6b-b48e-76025b1f0973-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pvppr\" (UID: \"6c338011-b98d-4a6b-b48e-76025b1f0973\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvppr" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.602335 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd568fb3-5f33-4412-a84b-c37d56678927-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fbxhv\" (UID: \"cd568fb3-5f33-4412-a84b-c37d56678927\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbxhv" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.607729 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/513aa088-5f0d-479a-9668-e8ae80738297-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.630969 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btf9s\" (UniqueName: \"kubernetes.io/projected/645bfb4f-f372-4d0a-99da-d6942d8b773c-kube-api-access-btf9s\") pod \"authentication-operator-69f744f599-lrpmd\" (UID: \"645bfb4f-f372-4d0a-99da-d6942d8b773c\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-lrpmd" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.637580 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.641542 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2tnv4"] Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.649822 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/626a56dc-ba4f-4ff6-a787-8f60403b4d42-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pjdt4\" (UID: \"626a56dc-ba4f-4ff6-a787-8f60403b4d42\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pjdt4" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.674167 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4xkk\" (UniqueName: \"kubernetes.io/projected/513aa088-5f0d-479a-9668-e8ae80738297-kube-api-access-t4xkk\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.693363 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:02 crc kubenswrapper[4740]: E1009 10:30:02.693792 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-10-09 10:30:03.193745404 +0000 UTC m=+142.155945785 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.693858 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c9c38db8-21e3-495b-b6db-3ea52bec9b5c-stats-auth\") pod \"router-default-5444994796-vntt9\" (UID: \"c9c38db8-21e3-495b-b6db-3ea52bec9b5c\") " pod="openshift-ingress/router-default-5444994796-vntt9" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.693877 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/494881f5-dbba-48c3-9871-c8d81136eda3-mountpoint-dir\") pod \"csi-hostpathplugin-95p7x\" (UID: \"494881f5-dbba-48c3-9871-c8d81136eda3\") " pod="hostpath-provisioner/csi-hostpathplugin-95p7x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.693898 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/494881f5-dbba-48c3-9871-c8d81136eda3-csi-data-dir\") pod \"csi-hostpathplugin-95p7x\" (UID: \"494881f5-dbba-48c3-9871-c8d81136eda3\") " pod="hostpath-provisioner/csi-hostpathplugin-95p7x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.693917 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/b2af699f-f757-4c89-ba00-55f9f8599fda-auth-proxy-config\") pod \"machine-config-operator-74547568cd-t6fqj\" (UID: \"b2af699f-f757-4c89-ba00-55f9f8599fda\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6fqj" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.693936 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8df53cdb-6c1c-41c2-8b24-5b73c400ced4-apiservice-cert\") pod \"packageserver-d55dfcdfc-kgt6j\" (UID: \"8df53cdb-6c1c-41c2-8b24-5b73c400ced4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.693954 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ts8p\" (UniqueName: \"kubernetes.io/projected/4d53b55b-dbd8-420d-bc00-128f7d5e1580-kube-api-access-4ts8p\") pod \"service-ca-operator-777779d784-cm7ws\" (UID: \"4d53b55b-dbd8-420d-bc00-128f7d5e1580\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cm7ws" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.693969 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d53b55b-dbd8-420d-bc00-128f7d5e1580-config\") pod \"service-ca-operator-777779d784-cm7ws\" (UID: \"4d53b55b-dbd8-420d-bc00-128f7d5e1580\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cm7ws" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.693984 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9c38db8-21e3-495b-b6db-3ea52bec9b5c-metrics-certs\") pod \"router-default-5444994796-vntt9\" (UID: \"c9c38db8-21e3-495b-b6db-3ea52bec9b5c\") " pod="openshift-ingress/router-default-5444994796-vntt9" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694015 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e-metrics-tls\") pod \"dns-default-xhb6x\" (UID: \"b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e\") " pod="openshift-dns/dns-default-xhb6x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694031 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9c38db8-21e3-495b-b6db-3ea52bec9b5c-service-ca-bundle\") pod \"router-default-5444994796-vntt9\" (UID: \"c9c38db8-21e3-495b-b6db-3ea52bec9b5c\") " pod="openshift-ingress/router-default-5444994796-vntt9" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694046 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b2af699f-f757-4c89-ba00-55f9f8599fda-images\") pod \"machine-config-operator-74547568cd-t6fqj\" (UID: \"b2af699f-f757-4c89-ba00-55f9f8599fda\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6fqj" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694067 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7bw8\" (UniqueName: \"kubernetes.io/projected/d7e13929-564e-49dd-baab-987ea26c55a3-kube-api-access-d7bw8\") pod \"machine-config-server-jrhp4\" (UID: \"d7e13929-564e-49dd-baab-987ea26c55a3\") " pod="openshift-machine-config-operator/machine-config-server-jrhp4" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694093 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a1412af2-816f-44a8-9862-b1a86ea6b9bc-srv-cert\") pod \"olm-operator-6b444d44fb-959ns\" (UID: \"a1412af2-816f-44a8-9862-b1a86ea6b9bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-959ns" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 
10:30:02.694110 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h2wg\" (UniqueName: \"kubernetes.io/projected/9ab9ab18-43b3-4138-a7a9-ee90d3abe1e1-kube-api-access-2h2wg\") pod \"migrator-59844c95c7-qwphn\" (UID: \"9ab9ab18-43b3-4138-a7a9-ee90d3abe1e1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qwphn" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694127 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb7sf\" (UniqueName: \"kubernetes.io/projected/91bc0d62-1ab0-4ca6-ad8d-ab99a4eea54b-kube-api-access-sb7sf\") pod \"control-plane-machine-set-operator-78cbb6b69f-mr7wc\" (UID: \"91bc0d62-1ab0-4ca6-ad8d-ab99a4eea54b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mr7wc" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694185 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdlc9\" (UniqueName: \"kubernetes.io/projected/93d94603-9462-4c78-9a9f-ee66522eb4cf-kube-api-access-hdlc9\") pod \"dns-operator-744455d44c-g6tmc\" (UID: \"93d94603-9462-4c78-9a9f-ee66522eb4cf\") " pod="openshift-dns-operator/dns-operator-744455d44c-g6tmc" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694202 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljsbm\" (UniqueName: \"kubernetes.io/projected/7d9654ef-644a-4274-b02b-c8eaf9d53a96-kube-api-access-ljsbm\") pod \"catalog-operator-68c6474976-hstzh\" (UID: \"7d9654ef-644a-4274-b02b-c8eaf9d53a96\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hstzh" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694217 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54zft\" (UniqueName: \"kubernetes.io/projected/07b58f91-881e-4c94-96b6-ff6126e39824-kube-api-access-54zft\") pod 
\"marketplace-operator-79b997595-zxkqd\" (UID: \"07b58f91-881e-4c94-96b6-ff6126e39824\") " pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694234 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/07b58f91-881e-4c94-96b6-ff6126e39824-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zxkqd\" (UID: \"07b58f91-881e-4c94-96b6-ff6126e39824\") " pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694249 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54rjl\" (UniqueName: \"kubernetes.io/projected/494881f5-dbba-48c3-9871-c8d81136eda3-kube-api-access-54rjl\") pod \"csi-hostpathplugin-95p7x\" (UID: \"494881f5-dbba-48c3-9871-c8d81136eda3\") " pod="hostpath-provisioner/csi-hostpathplugin-95p7x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694263 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a1412af2-816f-44a8-9862-b1a86ea6b9bc-profile-collector-cert\") pod \"olm-operator-6b444d44fb-959ns\" (UID: \"a1412af2-816f-44a8-9862-b1a86ea6b9bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-959ns" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694283 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24hmr\" (UniqueName: \"kubernetes.io/projected/b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e-kube-api-access-24hmr\") pod \"dns-default-xhb6x\" (UID: \"b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e\") " pod="openshift-dns/dns-default-xhb6x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694303 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/dc2dbb2e-c3ba-4ed2-becc-a6e809fe46a5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-598nr\" (UID: \"dc2dbb2e-c3ba-4ed2-becc-a6e809fe46a5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-598nr" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694320 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/91bc0d62-1ab0-4ca6-ad8d-ab99a4eea54b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mr7wc\" (UID: \"91bc0d62-1ab0-4ca6-ad8d-ab99a4eea54b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mr7wc" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694336 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s9vv\" (UniqueName: \"kubernetes.io/projected/a1412af2-816f-44a8-9862-b1a86ea6b9bc-kube-api-access-6s9vv\") pod \"olm-operator-6b444d44fb-959ns\" (UID: \"a1412af2-816f-44a8-9862-b1a86ea6b9bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-959ns" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694353 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmwhj\" (UniqueName: \"kubernetes.io/projected/2fd6215a-2d0b-48c5-be33-130bb55803c7-kube-api-access-gmwhj\") pod \"kube-storage-version-migrator-operator-b67b599dd-flsgk\" (UID: \"2fd6215a-2d0b-48c5-be33-130bb55803c7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flsgk" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694371 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twbl9\" (UniqueName: \"kubernetes.io/projected/4c0cc7fa-dbfa-417d-bdcd-eb9dfb10a67d-kube-api-access-twbl9\") pod \"machine-config-controller-84d6567774-828hm\" 
(UID: \"4c0cc7fa-dbfa-417d-bdcd-eb9dfb10a67d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-828hm" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694393 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4rqb\" (UniqueName: \"kubernetes.io/projected/b8d56056-c06c-441c-8936-0416f53f5da0-kube-api-access-s4rqb\") pod \"ingress-canary-c78zk\" (UID: \"b8d56056-c06c-441c-8936-0416f53f5da0\") " pod="openshift-ingress-canary/ingress-canary-c78zk" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694425 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/494881f5-dbba-48c3-9871-c8d81136eda3-registration-dir\") pod \"csi-hostpathplugin-95p7x\" (UID: \"494881f5-dbba-48c3-9871-c8d81136eda3\") " pod="hostpath-provisioner/csi-hostpathplugin-95p7x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694446 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1616614d-03d7-42ee-913f-711b77d1032f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-w956x\" (UID: \"1616614d-03d7-42ee-913f-711b77d1032f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w956x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694462 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8d56056-c06c-441c-8936-0416f53f5da0-cert\") pod \"ingress-canary-c78zk\" (UID: \"b8d56056-c06c-441c-8936-0416f53f5da0\") " pod="openshift-ingress-canary/ingress-canary-c78zk" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694476 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/d7e13929-564e-49dd-baab-987ea26c55a3-certs\") pod \"machine-config-server-jrhp4\" (UID: \"d7e13929-564e-49dd-baab-987ea26c55a3\") " pod="openshift-machine-config-operator/machine-config-server-jrhp4" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694494 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6brpw\" (UniqueName: \"kubernetes.io/projected/1616614d-03d7-42ee-913f-711b77d1032f-kube-api-access-6brpw\") pod \"package-server-manager-789f6589d5-w956x\" (UID: \"1616614d-03d7-42ee-913f-711b77d1032f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w956x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694508 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/494881f5-dbba-48c3-9871-c8d81136eda3-socket-dir\") pod \"csi-hostpathplugin-95p7x\" (UID: \"494881f5-dbba-48c3-9871-c8d81136eda3\") " pod="hostpath-provisioner/csi-hostpathplugin-95p7x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694524 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fd6215a-2d0b-48c5-be33-130bb55803c7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-flsgk\" (UID: \"2fd6215a-2d0b-48c5-be33-130bb55803c7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flsgk" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694570 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fd6215a-2d0b-48c5-be33-130bb55803c7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-flsgk\" (UID: \"2fd6215a-2d0b-48c5-be33-130bb55803c7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flsgk" Oct 
09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694591 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/494881f5-dbba-48c3-9871-c8d81136eda3-plugins-dir\") pod \"csi-hostpathplugin-95p7x\" (UID: \"494881f5-dbba-48c3-9871-c8d81136eda3\") " pod="hostpath-provisioner/csi-hostpathplugin-95p7x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694607 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4411b16-07f8-4701-ad4f-7645a00e829f-config-volume\") pod \"collect-profiles-29333430-lmrlf\" (UID: \"e4411b16-07f8-4701-ad4f-7645a00e829f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694629 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694645 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7d9654ef-644a-4274-b02b-c8eaf9d53a96-srv-cert\") pod \"catalog-operator-68c6474976-hstzh\" (UID: \"7d9654ef-644a-4274-b02b-c8eaf9d53a96\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hstzh" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694660 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d53b55b-dbd8-420d-bc00-128f7d5e1580-serving-cert\") pod \"service-ca-operator-777779d784-cm7ws\" (UID: 
\"4d53b55b-dbd8-420d-bc00-128f7d5e1580\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cm7ws" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694688 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c1705605-4391-45da-a171-23f5a7e0ff74-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kp59b\" (UID: \"c1705605-4391-45da-a171-23f5a7e0ff74\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kp59b" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694705 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8df53cdb-6c1c-41c2-8b24-5b73c400ced4-tmpfs\") pod \"packageserver-d55dfcdfc-kgt6j\" (UID: \"8df53cdb-6c1c-41c2-8b24-5b73c400ced4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694720 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fbgt\" (UniqueName: \"kubernetes.io/projected/7b04fa40-8401-462e-8fc1-c55dbca89bbc-kube-api-access-6fbgt\") pod \"service-ca-9c57cc56f-842d9\" (UID: \"7b04fa40-8401-462e-8fc1-c55dbca89bbc\") " pod="openshift-service-ca/service-ca-9c57cc56f-842d9" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694735 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n74sw\" (UniqueName: \"kubernetes.io/projected/b2af699f-f757-4c89-ba00-55f9f8599fda-kube-api-access-n74sw\") pod \"machine-config-operator-74547568cd-t6fqj\" (UID: \"b2af699f-f757-4c89-ba00-55f9f8599fda\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6fqj" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694770 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/7b04fa40-8401-462e-8fc1-c55dbca89bbc-signing-cabundle\") pod \"service-ca-9c57cc56f-842d9\" (UID: \"7b04fa40-8401-462e-8fc1-c55dbca89bbc\") " pod="openshift-service-ca/service-ca-9c57cc56f-842d9" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694789 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc2dbb2e-c3ba-4ed2-becc-a6e809fe46a5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-598nr\" (UID: \"dc2dbb2e-c3ba-4ed2-becc-a6e809fe46a5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-598nr" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694804 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/93d94603-9462-4c78-9a9f-ee66522eb4cf-metrics-tls\") pod \"dns-operator-744455d44c-g6tmc\" (UID: \"93d94603-9462-4c78-9a9f-ee66522eb4cf\") " pod="openshift-dns-operator/dns-operator-744455d44c-g6tmc" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694828 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b2af699f-f757-4c89-ba00-55f9f8599fda-proxy-tls\") pod \"machine-config-operator-74547568cd-t6fqj\" (UID: \"b2af699f-f757-4c89-ba00-55f9f8599fda\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6fqj" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694848 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d7e13929-564e-49dd-baab-987ea26c55a3-node-bootstrap-token\") pod \"machine-config-server-jrhp4\" (UID: \"d7e13929-564e-49dd-baab-987ea26c55a3\") " pod="openshift-machine-config-operator/machine-config-server-jrhp4" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694872 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7b04fa40-8401-462e-8fc1-c55dbca89bbc-signing-key\") pod \"service-ca-9c57cc56f-842d9\" (UID: \"7b04fa40-8401-462e-8fc1-c55dbca89bbc\") " pod="openshift-service-ca/service-ca-9c57cc56f-842d9" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694887 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c9c38db8-21e3-495b-b6db-3ea52bec9b5c-default-certificate\") pod \"router-default-5444994796-vntt9\" (UID: \"c9c38db8-21e3-495b-b6db-3ea52bec9b5c\") " pod="openshift-ingress/router-default-5444994796-vntt9" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694902 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4c0cc7fa-dbfa-417d-bdcd-eb9dfb10a67d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-828hm\" (UID: \"4c0cc7fa-dbfa-417d-bdcd-eb9dfb10a67d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-828hm" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694927 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e-config-volume\") pod \"dns-default-xhb6x\" (UID: \"b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e\") " pod="openshift-dns/dns-default-xhb6x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.694949 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07b58f91-881e-4c94-96b6-ff6126e39824-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zxkqd\" (UID: \"07b58f91-881e-4c94-96b6-ff6126e39824\") " pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd" Oct 09 10:30:02 crc 
kubenswrapper[4740]: I1009 10:30:02.694980 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnvp9\" (UniqueName: \"kubernetes.io/projected/c9c38db8-21e3-495b-b6db-3ea52bec9b5c-kube-api-access-fnvp9\") pod \"router-default-5444994796-vntt9\" (UID: \"c9c38db8-21e3-495b-b6db-3ea52bec9b5c\") " pod="openshift-ingress/router-default-5444994796-vntt9" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.695001 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8ww7\" (UniqueName: \"kubernetes.io/projected/e4411b16-07f8-4701-ad4f-7645a00e829f-kube-api-access-j8ww7\") pod \"collect-profiles-29333430-lmrlf\" (UID: \"e4411b16-07f8-4701-ad4f-7645a00e829f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.695017 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dsw4\" (UniqueName: \"kubernetes.io/projected/c1705605-4391-45da-a171-23f5a7e0ff74-kube-api-access-5dsw4\") pod \"multus-admission-controller-857f4d67dd-kp59b\" (UID: \"c1705605-4391-45da-a171-23f5a7e0ff74\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kp59b" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.695032 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4411b16-07f8-4701-ad4f-7645a00e829f-secret-volume\") pod \"collect-profiles-29333430-lmrlf\" (UID: \"e4411b16-07f8-4701-ad4f-7645a00e829f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.695051 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8df53cdb-6c1c-41c2-8b24-5b73c400ced4-webhook-cert\") pod \"packageserver-d55dfcdfc-kgt6j\" (UID: 
\"8df53cdb-6c1c-41c2-8b24-5b73c400ced4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.695065 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfxtj\" (UniqueName: \"kubernetes.io/projected/8df53cdb-6c1c-41c2-8b24-5b73c400ced4-kube-api-access-jfxtj\") pod \"packageserver-d55dfcdfc-kgt6j\" (UID: \"8df53cdb-6c1c-41c2-8b24-5b73c400ced4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.695080 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7d9654ef-644a-4274-b02b-c8eaf9d53a96-profile-collector-cert\") pod \"catalog-operator-68c6474976-hstzh\" (UID: \"7d9654ef-644a-4274-b02b-c8eaf9d53a96\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hstzh" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.695095 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c0cc7fa-dbfa-417d-bdcd-eb9dfb10a67d-proxy-tls\") pod \"machine-config-controller-84d6567774-828hm\" (UID: \"4c0cc7fa-dbfa-417d-bdcd-eb9dfb10a67d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-828hm" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.695115 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc2dbb2e-c3ba-4ed2-becc-a6e809fe46a5-config\") pod \"kube-apiserver-operator-766d6c64bb-598nr\" (UID: \"dc2dbb2e-c3ba-4ed2-becc-a6e809fe46a5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-598nr" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.695964 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dc2dbb2e-c3ba-4ed2-becc-a6e809fe46a5-config\") pod \"kube-apiserver-operator-766d6c64bb-598nr\" (UID: \"dc2dbb2e-c3ba-4ed2-becc-a6e809fe46a5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-598nr" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.699095 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/494881f5-dbba-48c3-9871-c8d81136eda3-socket-dir\") pod \"csi-hostpathplugin-95p7x\" (UID: \"494881f5-dbba-48c3-9871-c8d81136eda3\") " pod="hostpath-provisioner/csi-hostpathplugin-95p7x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.701858 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4411b16-07f8-4701-ad4f-7645a00e829f-secret-volume\") pod \"collect-profiles-29333430-lmrlf\" (UID: \"e4411b16-07f8-4701-ad4f-7645a00e829f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.702420 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7d9654ef-644a-4274-b02b-c8eaf9d53a96-srv-cert\") pod \"catalog-operator-68c6474976-hstzh\" (UID: \"7d9654ef-644a-4274-b02b-c8eaf9d53a96\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hstzh" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.702959 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7d9654ef-644a-4274-b02b-c8eaf9d53a96-profile-collector-cert\") pod \"catalog-operator-68c6474976-hstzh\" (UID: \"7d9654ef-644a-4274-b02b-c8eaf9d53a96\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hstzh" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.702987 4740 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b2af699f-f757-4c89-ba00-55f9f8599fda-proxy-tls\") pod \"machine-config-operator-74547568cd-t6fqj\" (UID: \"b2af699f-f757-4c89-ba00-55f9f8599fda\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6fqj" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.703400 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fd6215a-2d0b-48c5-be33-130bb55803c7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-flsgk\" (UID: \"2fd6215a-2d0b-48c5-be33-130bb55803c7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flsgk" Oct 09 10:30:02 crc kubenswrapper[4740]: E1009 10:30:02.703662 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:03.203636765 +0000 UTC m=+142.165837146 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.704097 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07b58f91-881e-4c94-96b6-ff6126e39824-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zxkqd\" (UID: \"07b58f91-881e-4c94-96b6-ff6126e39824\") " pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.704498 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/494881f5-dbba-48c3-9871-c8d81136eda3-mountpoint-dir\") pod \"csi-hostpathplugin-95p7x\" (UID: \"494881f5-dbba-48c3-9871-c8d81136eda3\") " pod="hostpath-provisioner/csi-hostpathplugin-95p7x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.704653 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/494881f5-dbba-48c3-9871-c8d81136eda3-csi-data-dir\") pod \"csi-hostpathplugin-95p7x\" (UID: \"494881f5-dbba-48c3-9871-c8d81136eda3\") " pod="hostpath-provisioner/csi-hostpathplugin-95p7x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.705142 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7b04fa40-8401-462e-8fc1-c55dbca89bbc-signing-cabundle\") pod \"service-ca-9c57cc56f-842d9\" (UID: \"7b04fa40-8401-462e-8fc1-c55dbca89bbc\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-842d9" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.705214 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/494881f5-dbba-48c3-9871-c8d81136eda3-registration-dir\") pod \"csi-hostpathplugin-95p7x\" (UID: \"494881f5-dbba-48c3-9871-c8d81136eda3\") " pod="hostpath-provisioner/csi-hostpathplugin-95p7x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.705727 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9c38db8-21e3-495b-b6db-3ea52bec9b5c-service-ca-bundle\") pod \"router-default-5444994796-vntt9\" (UID: \"c9c38db8-21e3-495b-b6db-3ea52bec9b5c\") " pod="openshift-ingress/router-default-5444994796-vntt9" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.705840 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8df53cdb-6c1c-41c2-8b24-5b73c400ced4-tmpfs\") pod \"packageserver-d55dfcdfc-kgt6j\" (UID: \"8df53cdb-6c1c-41c2-8b24-5b73c400ced4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.706232 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b2af699f-f757-4c89-ba00-55f9f8599fda-images\") pod \"machine-config-operator-74547568cd-t6fqj\" (UID: \"b2af699f-f757-4c89-ba00-55f9f8599fda\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6fqj" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.706376 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d7e13929-564e-49dd-baab-987ea26c55a3-certs\") pod \"machine-config-server-jrhp4\" (UID: \"d7e13929-564e-49dd-baab-987ea26c55a3\") " 
pod="openshift-machine-config-operator/machine-config-server-jrhp4" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.707543 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/494881f5-dbba-48c3-9871-c8d81136eda3-plugins-dir\") pod \"csi-hostpathplugin-95p7x\" (UID: \"494881f5-dbba-48c3-9871-c8d81136eda3\") " pod="hostpath-provisioner/csi-hostpathplugin-95p7x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.707848 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e-config-volume\") pod \"dns-default-xhb6x\" (UID: \"b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e\") " pod="openshift-dns/dns-default-xhb6x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.707858 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fd6215a-2d0b-48c5-be33-130bb55803c7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-flsgk\" (UID: \"2fd6215a-2d0b-48c5-be33-130bb55803c7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flsgk" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.708040 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b2af699f-f757-4c89-ba00-55f9f8599fda-auth-proxy-config\") pod \"machine-config-operator-74547568cd-t6fqj\" (UID: \"b2af699f-f757-4c89-ba00-55f9f8599fda\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6fqj" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.708468 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d53b55b-dbd8-420d-bc00-128f7d5e1580-config\") pod \"service-ca-operator-777779d784-cm7ws\" (UID: 
\"4d53b55b-dbd8-420d-bc00-128f7d5e1580\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cm7ws" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.709927 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8df53cdb-6c1c-41c2-8b24-5b73c400ced4-webhook-cert\") pod \"packageserver-d55dfcdfc-kgt6j\" (UID: \"8df53cdb-6c1c-41c2-8b24-5b73c400ced4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.710353 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4411b16-07f8-4701-ad4f-7645a00e829f-config-volume\") pod \"collect-profiles-29333430-lmrlf\" (UID: \"e4411b16-07f8-4701-ad4f-7645a00e829f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.710445 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c9c38db8-21e3-495b-b6db-3ea52bec9b5c-default-certificate\") pod \"router-default-5444994796-vntt9\" (UID: \"c9c38db8-21e3-495b-b6db-3ea52bec9b5c\") " pod="openshift-ingress/router-default-5444994796-vntt9" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.711257 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9c38db8-21e3-495b-b6db-3ea52bec9b5c-metrics-certs\") pod \"router-default-5444994796-vntt9\" (UID: \"c9c38db8-21e3-495b-b6db-3ea52bec9b5c\") " pod="openshift-ingress/router-default-5444994796-vntt9" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.711400 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a1412af2-816f-44a8-9862-b1a86ea6b9bc-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-959ns\" (UID: \"a1412af2-816f-44a8-9862-b1a86ea6b9bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-959ns" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.711438 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a1412af2-816f-44a8-9862-b1a86ea6b9bc-srv-cert\") pod \"olm-operator-6b444d44fb-959ns\" (UID: \"a1412af2-816f-44a8-9862-b1a86ea6b9bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-959ns" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.713349 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1616614d-03d7-42ee-913f-711b77d1032f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-w956x\" (UID: \"1616614d-03d7-42ee-913f-711b77d1032f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w956x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.713372 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxqx9\" (UniqueName: \"kubernetes.io/projected/29e2bc86-ce7e-4abd-93d7-7adf15987e18-kube-api-access-gxqx9\") pod \"openshift-config-operator-7777fb866f-zlt7s\" (UID: \"29e2bc86-ce7e-4abd-93d7-7adf15987e18\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zlt7s" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.713527 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7b04fa40-8401-462e-8fc1-c55dbca89bbc-signing-key\") pod \"service-ca-9c57cc56f-842d9\" (UID: \"7b04fa40-8401-462e-8fc1-c55dbca89bbc\") " pod="openshift-service-ca/service-ca-9c57cc56f-842d9" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.713924 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e-metrics-tls\") pod \"dns-default-xhb6x\" (UID: \"b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e\") " pod="openshift-dns/dns-default-xhb6x" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.715251 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4c0cc7fa-dbfa-417d-bdcd-eb9dfb10a67d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-828hm\" (UID: \"4c0cc7fa-dbfa-417d-bdcd-eb9dfb10a67d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-828hm" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.715740 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/93d94603-9462-4c78-9a9f-ee66522eb4cf-metrics-tls\") pod \"dns-operator-744455d44c-g6tmc\" (UID: \"93d94603-9462-4c78-9a9f-ee66522eb4cf\") " pod="openshift-dns-operator/dns-operator-744455d44c-g6tmc" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.715895 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8df53cdb-6c1c-41c2-8b24-5b73c400ced4-apiservice-cert\") pod \"packageserver-d55dfcdfc-kgt6j\" (UID: \"8df53cdb-6c1c-41c2-8b24-5b73c400ced4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.716250 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8d56056-c06c-441c-8936-0416f53f5da0-cert\") pod \"ingress-canary-c78zk\" (UID: \"b8d56056-c06c-441c-8936-0416f53f5da0\") " pod="openshift-ingress-canary/ingress-canary-c78zk" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.719353 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/c1705605-4391-45da-a171-23f5a7e0ff74-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kp59b\" (UID: \"c1705605-4391-45da-a171-23f5a7e0ff74\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kp59b" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.719717 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c0cc7fa-dbfa-417d-bdcd-eb9dfb10a67d-proxy-tls\") pod \"machine-config-controller-84d6567774-828hm\" (UID: \"4c0cc7fa-dbfa-417d-bdcd-eb9dfb10a67d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-828hm" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.720789 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d7e13929-564e-49dd-baab-987ea26c55a3-node-bootstrap-token\") pod \"machine-config-server-jrhp4\" (UID: \"d7e13929-564e-49dd-baab-987ea26c55a3\") " pod="openshift-machine-config-operator/machine-config-server-jrhp4" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.725062 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/07b58f91-881e-4c94-96b6-ff6126e39824-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zxkqd\" (UID: \"07b58f91-881e-4c94-96b6-ff6126e39824\") " pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.725961 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58ba2ce5-2051-4631-a4dd-3b8bd96759f8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-h4zqk\" (UID: \"58ba2ce5-2051-4631-a4dd-3b8bd96759f8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h4zqk" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.726229 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc2dbb2e-c3ba-4ed2-becc-a6e809fe46a5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-598nr\" (UID: \"dc2dbb2e-c3ba-4ed2-becc-a6e809fe46a5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-598nr" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.727126 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d53b55b-dbd8-420d-bc00-128f7d5e1580-serving-cert\") pod \"service-ca-operator-777779d784-cm7ws\" (UID: \"4d53b55b-dbd8-420d-bc00-128f7d5e1580\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cm7ws" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.729042 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c9c38db8-21e3-495b-b6db-3ea52bec9b5c-stats-auth\") pod \"router-default-5444994796-vntt9\" (UID: \"c9c38db8-21e3-495b-b6db-3ea52bec9b5c\") " pod="openshift-ingress/router-default-5444994796-vntt9" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.729920 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/91bc0d62-1ab0-4ca6-ad8d-ab99a4eea54b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mr7wc\" (UID: \"91bc0d62-1ab0-4ca6-ad8d-ab99a4eea54b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mr7wc" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.730736 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd568fb3-5f33-4412-a84b-c37d56678927-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fbxhv\" (UID: \"cd568fb3-5f33-4412-a84b-c37d56678927\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbxhv" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.753871 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v8nt\" (UniqueName: \"kubernetes.io/projected/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-kube-api-access-8v8nt\") pod \"console-f9d7485db-g68sq\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " pod="openshift-console/console-f9d7485db-g68sq" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.763481 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zlt7s" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.772731 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt9nw\" (UniqueName: \"kubernetes.io/projected/f2de169d-9583-46e5-b2ee-da1a6903eafb-kube-api-access-pt9nw\") pod \"downloads-7954f5f757-qzp8b\" (UID: \"f2de169d-9583-46e5-b2ee-da1a6903eafb\") " pod="openshift-console/downloads-7954f5f757-qzp8b" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.780461 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-g68sq" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.789939 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/513aa088-5f0d-479a-9668-e8ae80738297-bound-sa-token\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.796708 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-f67b5"] Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.798349 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:02 crc kubenswrapper[4740]: E1009 10:30:02.798889 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:03.298871273 +0000 UTC m=+142.261071654 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.802517 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qzp8b" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.813368 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9tbx\" (UniqueName: \"kubernetes.io/projected/626a56dc-ba4f-4ff6-a787-8f60403b4d42-kube-api-access-d9tbx\") pod \"cluster-image-registry-operator-dc59b4c8b-pjdt4\" (UID: \"626a56dc-ba4f-4ff6-a787-8f60403b4d42\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pjdt4" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.823746 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pjdt4" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.830580 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbxhv" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.843620 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0daf28c1-6a40-4a53-a196-521d95be9aab-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6wv7f\" (UID: \"0daf28c1-6a40-4a53-a196-521d95be9aab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6wv7f" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.848655 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-lrpmd" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.849521 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k89jq\" (UniqueName: \"kubernetes.io/projected/6c338011-b98d-4a6b-b48e-76025b1f0973-kube-api-access-k89jq\") pod \"cluster-samples-operator-665b6dd947-pvppr\" (UID: \"6c338011-b98d-4a6b-b48e-76025b1f0973\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvppr" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.850354 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x99pn"] Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.871007 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft7tr\" (UniqueName: \"kubernetes.io/projected/58ba2ce5-2051-4631-a4dd-3b8bd96759f8-kube-api-access-ft7tr\") pod \"ingress-operator-5b745b69d9-h4zqk\" (UID: \"58ba2ce5-2051-4631-a4dd-3b8bd96759f8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h4zqk" Oct 09 10:30:02 crc kubenswrapper[4740]: W1009 10:30:02.875402 4740 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63fc8742_b2a1_42a1_b78e_11e736801124.slice/crio-f6b8492ead6520298475cf579470fe39d65f6ab474484dfe8137a206e92c776c WatchSource:0}: Error finding container f6b8492ead6520298475cf579470fe39d65f6ab474484dfe8137a206e92c776c: Status 404 returned error can't find the container with id f6b8492ead6520298475cf579470fe39d65f6ab474484dfe8137a206e92c776c Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.899915 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:02 crc kubenswrapper[4740]: E1009 10:30:02.900401 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:03.400386638 +0000 UTC m=+142.362587019 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.917896 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8ww7\" (UniqueName: \"kubernetes.io/projected/e4411b16-07f8-4701-ad4f-7645a00e829f-kube-api-access-j8ww7\") pod \"collect-profiles-29333430-lmrlf\" (UID: \"e4411b16-07f8-4701-ad4f-7645a00e829f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.936917 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dsw4\" (UniqueName: \"kubernetes.io/projected/c1705605-4391-45da-a171-23f5a7e0ff74-kube-api-access-5dsw4\") pod \"multus-admission-controller-857f4d67dd-kp59b\" (UID: \"c1705605-4391-45da-a171-23f5a7e0ff74\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kp59b" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.955591 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ssz6d"] Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.955812 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-kp59b" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.960329 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnvp9\" (UniqueName: \"kubernetes.io/projected/c9c38db8-21e3-495b-b6db-3ea52bec9b5c-kube-api-access-fnvp9\") pod \"router-default-5444994796-vntt9\" (UID: \"c9c38db8-21e3-495b-b6db-3ea52bec9b5c\") " pod="openshift-ingress/router-default-5444994796-vntt9" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.978664 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-vntt9" Oct 09 10:30:02 crc kubenswrapper[4740]: I1009 10:30:02.985334 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfxtj\" (UniqueName: \"kubernetes.io/projected/8df53cdb-6c1c-41c2-8b24-5b73c400ced4-kube-api-access-jfxtj\") pod \"packageserver-d55dfcdfc-kgt6j\" (UID: \"8df53cdb-6c1c-41c2-8b24-5b73c400ced4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:02.999470 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6brpw\" (UniqueName: \"kubernetes.io/projected/1616614d-03d7-42ee-913f-711b77d1032f-kube-api-access-6brpw\") pod \"package-server-manager-789f6589d5-w956x\" (UID: \"1616614d-03d7-42ee-913f-711b77d1032f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w956x" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.001182 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:03 crc 
kubenswrapper[4740]: E1009 10:30:03.001543 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:03.501527532 +0000 UTC m=+142.463727913 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.011807 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fs6nz"] Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.016063 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6zqw2"] Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.022688 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.024071 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s9vv\" (UniqueName: \"kubernetes.io/projected/a1412af2-816f-44a8-9862-b1a86ea6b9bc-kube-api-access-6s9vv\") pod \"olm-operator-6b444d44fb-959ns\" (UID: \"a1412af2-816f-44a8-9862-b1a86ea6b9bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-959ns" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.036120 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdlc9\" (UniqueName: \"kubernetes.io/projected/93d94603-9462-4c78-9a9f-ee66522eb4cf-kube-api-access-hdlc9\") pod \"dns-operator-744455d44c-g6tmc\" (UID: \"93d94603-9462-4c78-9a9f-ee66522eb4cf\") " pod="openshift-dns-operator/dns-operator-744455d44c-g6tmc" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.051791 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54zft\" (UniqueName: \"kubernetes.io/projected/07b58f91-881e-4c94-96b6-ff6126e39824-kube-api-access-54zft\") pod \"marketplace-operator-79b997595-zxkqd\" (UID: \"07b58f91-881e-4c94-96b6-ff6126e39824\") " pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.059398 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w956x" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.067124 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvppr" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.067554 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.077195 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n74sw\" (UniqueName: \"kubernetes.io/projected/b2af699f-f757-4c89-ba00-55f9f8599fda-kube-api-access-n74sw\") pod \"machine-config-operator-74547568cd-t6fqj\" (UID: \"b2af699f-f757-4c89-ba00-55f9f8599fda\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6fqj" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.090427 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc2dbb2e-c3ba-4ed2-becc-a6e809fe46a5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-598nr\" (UID: \"dc2dbb2e-c3ba-4ed2-becc-a6e809fe46a5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-598nr" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.102233 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g"] Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.118218 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7bw8\" (UniqueName: \"kubernetes.io/projected/d7e13929-564e-49dd-baab-987ea26c55a3-kube-api-access-d7bw8\") pod \"machine-config-server-jrhp4\" (UID: \"d7e13929-564e-49dd-baab-987ea26c55a3\") " pod="openshift-machine-config-operator/machine-config-server-jrhp4" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.121030 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:03 crc kubenswrapper[4740]: E1009 10:30:03.121365 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:03.621353418 +0000 UTC m=+142.583553799 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.122478 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h4zqk" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.138679 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6wv7f" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.143241 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24hmr\" (UniqueName: \"kubernetes.io/projected/b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e-kube-api-access-24hmr\") pod \"dns-default-xhb6x\" (UID: \"b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e\") " pod="openshift-dns/dns-default-xhb6x" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.150976 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zlt7s"] Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.151031 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lxzfg"] Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.156121 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h2wg\" (UniqueName: \"kubernetes.io/projected/9ab9ab18-43b3-4138-a7a9-ee90d3abe1e1-kube-api-access-2h2wg\") pod \"migrator-59844c95c7-qwphn\" (UID: \"9ab9ab18-43b3-4138-a7a9-ee90d3abe1e1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qwphn" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.161261 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pjdt4"] Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.166600 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-598nr" Oct 09 10:30:03 crc kubenswrapper[4740]: W1009 10:30:03.167591 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68bb29c6_1224_44e0_b307_4a2b226288c5.slice/crio-7b4d80ce7d2b3b3ac80ca27aa51047b1a0cc4c9f2095025d1ec5488f7a691261 WatchSource:0}: Error finding container 7b4d80ce7d2b3b3ac80ca27aa51047b1a0cc4c9f2095025d1ec5488f7a691261: Status 404 returned error can't find the container with id 7b4d80ce7d2b3b3ac80ca27aa51047b1a0cc4c9f2095025d1ec5488f7a691261 Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.178431 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fbgt\" (UniqueName: \"kubernetes.io/projected/7b04fa40-8401-462e-8fc1-c55dbca89bbc-kube-api-access-6fbgt\") pod \"service-ca-9c57cc56f-842d9\" (UID: \"7b04fa40-8401-462e-8fc1-c55dbca89bbc\") " pod="openshift-service-ca/service-ca-9c57cc56f-842d9" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.189221 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54rjl\" (UniqueName: \"kubernetes.io/projected/494881f5-dbba-48c3-9871-c8d81136eda3-kube-api-access-54rjl\") pod \"csi-hostpathplugin-95p7x\" (UID: \"494881f5-dbba-48c3-9871-c8d81136eda3\") " pod="hostpath-provisioner/csi-hostpathplugin-95p7x" Oct 09 10:30:03 crc kubenswrapper[4740]: W1009 10:30:03.197052 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod626a56dc_ba4f_4ff6_a787_8f60403b4d42.slice/crio-8d064091bf9659ed35480757b8af54c18d551d1b7d5729bd219a15e9d21e198e WatchSource:0}: Error finding container 8d064091bf9659ed35480757b8af54c18d551d1b7d5729bd219a15e9d21e198e: Status 404 returned error can't find the container with id 8d064091bf9659ed35480757b8af54c18d551d1b7d5729bd219a15e9d21e198e Oct 09 10:30:03 
crc kubenswrapper[4740]: I1009 10:30:03.211695 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4rqb\" (UniqueName: \"kubernetes.io/projected/b8d56056-c06c-441c-8936-0416f53f5da0-kube-api-access-s4rqb\") pod \"ingress-canary-c78zk\" (UID: \"b8d56056-c06c-441c-8936-0416f53f5da0\") " pod="openshift-ingress-canary/ingress-canary-c78zk" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.212826 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-g6tmc" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.215769 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-g68sq"] Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.223327 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:03 crc kubenswrapper[4740]: E1009 10:30:03.223452 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:03.723425307 +0000 UTC m=+142.685625688 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.223661 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:03 crc kubenswrapper[4740]: E1009 10:30:03.225831 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:03.72581526 +0000 UTC m=+142.688015641 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.241894 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6fqj" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.247470 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ts8p\" (UniqueName: \"kubernetes.io/projected/4d53b55b-dbd8-420d-bc00-128f7d5e1580-kube-api-access-4ts8p\") pod \"service-ca-operator-777779d784-cm7ws\" (UID: \"4d53b55b-dbd8-420d-bc00-128f7d5e1580\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cm7ws" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.263233 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.266002 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb7sf\" (UniqueName: \"kubernetes.io/projected/91bc0d62-1ab0-4ca6-ad8d-ab99a4eea54b-kube-api-access-sb7sf\") pod \"control-plane-machine-set-operator-78cbb6b69f-mr7wc\" (UID: \"91bc0d62-1ab0-4ca6-ad8d-ab99a4eea54b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mr7wc" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.275341 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljsbm\" (UniqueName: \"kubernetes.io/projected/7d9654ef-644a-4274-b02b-c8eaf9d53a96-kube-api-access-ljsbm\") pod \"catalog-operator-68c6474976-hstzh\" (UID: \"7d9654ef-644a-4274-b02b-c8eaf9d53a96\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hstzh" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.287478 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qwphn" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.294850 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twbl9\" (UniqueName: \"kubernetes.io/projected/4c0cc7fa-dbfa-417d-bdcd-eb9dfb10a67d-kube-api-access-twbl9\") pod \"machine-config-controller-84d6567774-828hm\" (UID: \"4c0cc7fa-dbfa-417d-bdcd-eb9dfb10a67d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-828hm" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.298999 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mr7wc" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.306865 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-959ns" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.312954 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kp59b"] Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.314737 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmwhj\" (UniqueName: \"kubernetes.io/projected/2fd6215a-2d0b-48c5-be33-130bb55803c7-kube-api-access-gmwhj\") pod \"kube-storage-version-migrator-operator-b67b599dd-flsgk\" (UID: \"2fd6215a-2d0b-48c5-be33-130bb55803c7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flsgk" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.314980 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cm7ws" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.324897 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:03 crc kubenswrapper[4740]: E1009 10:30:03.325191 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:03.825176127 +0000 UTC m=+142.787376508 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.332123 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-jrhp4" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.340009 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-842d9" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.348814 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hstzh" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.396215 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-95p7x" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.402675 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xhb6x" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.411928 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-c78zk" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.414223 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j"] Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.434688 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:03 crc kubenswrapper[4740]: E1009 10:30:03.435136 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:03.935119483 +0000 UTC m=+142.897319864 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.456578 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qzp8b"] Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.472151 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lrpmd"] Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.472432 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" event={"ID":"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad","Type":"ContainerStarted","Data":"9638a5deb24f05fa2722fbe0c62d1dd065fec6d3f201967fc8bf9963ae2befd9"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.474817 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbxhv"] Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.490289 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wscmf" event={"ID":"db1aed22-417f-47ad-a29d-78effc6ac28d","Type":"ContainerStarted","Data":"c21e7bdac6a8222e8fe54c8908ed065e06ba250821006e243e23c6c493d1a1bd"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.490489 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wscmf" 
event={"ID":"db1aed22-417f-47ad-a29d-78effc6ac28d","Type":"ContainerStarted","Data":"608d3bf11d2553a25bfdbd8717f12c62eaf482ebd7116fea3d8e99269cc7137e"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.508731 4740 generic.go:334] "Generic (PLEG): container finished" podID="833471e4-0651-45ca-aec1-35c2a8a56b5f" containerID="6692db7769202afe4f2a424bec11d543300c87d034b0fc1954173b203361b791" exitCode=0 Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.508866 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" event={"ID":"833471e4-0651-45ca-aec1-35c2a8a56b5f","Type":"ContainerDied","Data":"6692db7769202afe4f2a424bec11d543300c87d034b0fc1954173b203361b791"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.508893 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" event={"ID":"833471e4-0651-45ca-aec1-35c2a8a56b5f","Type":"ContainerStarted","Data":"a32736fda723970c856ba89c914e60626508718d0983fb422950ca012fc21186"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.511092 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-f67b5" event={"ID":"a23bc8cd-dc20-4ade-88cd-1c61d1f6315f","Type":"ContainerStarted","Data":"b2b41fd9b33101531aea4ee0947505bcb9f54ff61bc8e9f5d5e5ff46881192b6"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.511135 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-f67b5" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.511149 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-f67b5" event={"ID":"a23bc8cd-dc20-4ade-88cd-1c61d1f6315f","Type":"ContainerStarted","Data":"82eef2f13c3334de1e5d7221ab3b802f260d3340c00a6d45388e42e939f7430a"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.518037 
4740 patch_prober.go:28] interesting pod/console-operator-58897d9998-f67b5 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.518092 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-f67b5" podUID="a23bc8cd-dc20-4ade-88cd-1c61d1f6315f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.522599 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" event={"ID":"63fc8742-b2a1-42a1-b78e-11e736801124","Type":"ContainerStarted","Data":"2bec7c6d0b22b7e8cb9345a07e1b8154d087d5263def2b1522904a168664313b"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.522686 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" event={"ID":"63fc8742-b2a1-42a1-b78e-11e736801124","Type":"ContainerStarted","Data":"f6b8492ead6520298475cf579470fe39d65f6ab474484dfe8137a206e92c776c"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.523563 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.527040 4740 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-x99pn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.527089 4740 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" podUID="63fc8742-b2a1-42a1-b78e-11e736801124" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.527553 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pxg57" event={"ID":"2cfbd3fb-f7f5-4578-9e24-72dbd185cf12","Type":"ContainerStarted","Data":"bd65859c444c296d83cd086fa68dee4bc1b25e15ecc6c5f45267430205cff843"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.527581 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pxg57" event={"ID":"2cfbd3fb-f7f5-4578-9e24-72dbd185cf12","Type":"ContainerStarted","Data":"2188cd1e33b5c13b07cf04b7973563509f31bd2087fc07960de9085cb90f1b80"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.527590 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pxg57" event={"ID":"2cfbd3fb-f7f5-4578-9e24-72dbd185cf12","Type":"ContainerStarted","Data":"99894b4c04f408a879a7bf9e4ec9863404beadc8b1d4981bbb5abccc46256ea2"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.529945 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-828hm" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.539231 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:03 crc kubenswrapper[4740]: E1009 10:30:03.539620 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:04.039592275 +0000 UTC m=+143.001792706 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.539793 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:03 crc kubenswrapper[4740]: E1009 10:30:03.540697 4740 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:04.040629893 +0000 UTC m=+143.002830274 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.542514 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pjdt4" event={"ID":"626a56dc-ba4f-4ff6-a787-8f60403b4d42","Type":"ContainerStarted","Data":"8d064091bf9659ed35480757b8af54c18d551d1b7d5729bd219a15e9d21e198e"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.546024 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flsgk" Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.552978 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-g68sq" event={"ID":"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39","Type":"ContainerStarted","Data":"92ab555905978d118b0186ab5ba1bb24e2f46b62cd1760f6fbd3dd55209938fc"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.556260 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fs6nz" event={"ID":"974f7f71-f43b-4a14-bac3-567229c728c7","Type":"ContainerStarted","Data":"b77ba1efa03a27515731adcd90a8d02c9840ca8150030b0877f491e7a43671f1"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.557717 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2tnv4" event={"ID":"6bfff965-33fb-4412-85e1-107e0cf34bf8","Type":"ContainerStarted","Data":"4c5eebe586cc4679342747d69e2920e5b28576e2ddd1471f749cb9d371f398ba"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.557767 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2tnv4" event={"ID":"6bfff965-33fb-4412-85e1-107e0cf34bf8","Type":"ContainerStarted","Data":"1dfb3317a3563985f879ecb0515046048ca9c1734d606b880f95b4ee34716a39"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.560028 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zlt7s" event={"ID":"29e2bc86-ce7e-4abd-93d7-7adf15987e18","Type":"ContainerStarted","Data":"3184457c60a1143cab83fda67b4847cf80721ea79d7a9b9a494e078152b1fcb5"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.560731 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" event={"ID":"68bb29c6-1224-44e0-b307-4a2b226288c5","Type":"ContainerStarted","Data":"7b4d80ce7d2b3b3ac80ca27aa51047b1a0cc4c9f2095025d1ec5488f7a691261"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.561499 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kp59b" event={"ID":"c1705605-4391-45da-a171-23f5a7e0ff74","Type":"ContainerStarted","Data":"300c0ce69a9a1d3f7201e759bdac680c8e0dcb852a6878267ddfabb3d3b5923d"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.562431 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g" event={"ID":"9063b645-eba3-4ba3-a871-23adad70136d","Type":"ContainerStarted","Data":"7cbf21bcab694f2ed43fbf52d6fa51dcf64d46a36e6f18ac24ad4c949df9e215"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.565720 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-vntt9" event={"ID":"c9c38db8-21e3-495b-b6db-3ea52bec9b5c","Type":"ContainerStarted","Data":"b3a63909d3e64724d821b2af8a4dc898c42f6385ec47049ff73263cbcab6feeb"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.569253 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" event={"ID":"bd8ebc5c-e47c-4177-968f-a3c924dbda0e","Type":"ContainerStarted","Data":"15561718b75e6eae5960e8d03d73d82a402e7405f3c7b439cdbd71ebb091bdf8"} Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.615008 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf"] Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.646136 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:03 crc kubenswrapper[4740]: E1009 10:30:03.649317 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:04.149289885 +0000 UTC m=+143.111490266 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.660515 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-598nr"] Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.708251 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvppr"] Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.715925 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w956x"] Oct 09 10:30:03 crc kubenswrapper[4740]: W1009 10:30:03.734893 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8df53cdb_6c1c_41c2_8b24_5b73c400ced4.slice/crio-7a12e0b0dcb1162875d85bb6733a48de9af0e50d3e1533d5bf79b2543d8893d5 WatchSource:0}: Error finding container 
7a12e0b0dcb1162875d85bb6733a48de9af0e50d3e1533d5bf79b2543d8893d5: Status 404 returned error can't find the container with id 7a12e0b0dcb1162875d85bb6733a48de9af0e50d3e1533d5bf79b2543d8893d5 Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.749731 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:03 crc kubenswrapper[4740]: E1009 10:30:03.750517 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:04.25046702 +0000 UTC m=+143.212667401 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:03 crc kubenswrapper[4740]: W1009 10:30:03.775922 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc2dbb2e_c3ba_4ed2_becc_a6e809fe46a5.slice/crio-9f57e7f4a9dec60e52874a04230d46aa622355f82b20054098c84c9e1690b832 WatchSource:0}: Error finding container 9f57e7f4a9dec60e52874a04230d46aa622355f82b20054098c84c9e1690b832: Status 404 returned error can't find the container with id 9f57e7f4a9dec60e52874a04230d46aa622355f82b20054098c84c9e1690b832 Oct 09 10:30:03 crc kubenswrapper[4740]: W1009 10:30:03.778190 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4411b16_07f8_4701_ad4f_7645a00e829f.slice/crio-cfb65d4d5bfc3ea9d9ce3e7401b180911e73dbfd09d9a7def757777d6d4b196f WatchSource:0}: Error finding container cfb65d4d5bfc3ea9d9ce3e7401b180911e73dbfd09d9a7def757777d6d4b196f: Status 404 returned error can't find the container with id cfb65d4d5bfc3ea9d9ce3e7401b180911e73dbfd09d9a7def757777d6d4b196f Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.789460 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6wv7f"] Oct 09 10:30:03 crc kubenswrapper[4740]: W1009 10:30:03.804459 4740 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1616614d_03d7_42ee_913f_711b77d1032f.slice/crio-62e37385a5748ef35a106fdf5bc51a00146e110b25f41517a95bf372d81ff131 WatchSource:0}: Error finding container 62e37385a5748ef35a106fdf5bc51a00146e110b25f41517a95bf372d81ff131: Status 404 returned error can't find the container with id 62e37385a5748ef35a106fdf5bc51a00146e110b25f41517a95bf372d81ff131 Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.848672 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-t6fqj"] Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.854141 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:03 crc kubenswrapper[4740]: E1009 10:30:03.855719 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:04.355669661 +0000 UTC m=+143.317870042 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.885180 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-842d9"] Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.924679 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-h4zqk"] Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.928618 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-g6tmc"] Oct 09 10:30:03 crc kubenswrapper[4740]: I1009 10:30:03.957395 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:03 crc kubenswrapper[4740]: E1009 10:30:03.957725 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:04.457713119 +0000 UTC m=+143.419913500 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.059258 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:04 crc kubenswrapper[4740]: E1009 10:30:04.059741 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:04.559720586 +0000 UTC m=+143.521920967 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.128589 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" podStartSLOduration=123.128571949 podStartE2EDuration="2m3.128571949s" podCreationTimestamp="2025-10-09 10:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:04.125806247 +0000 UTC m=+143.088006638" watchObservedRunningTime="2025-10-09 10:30:04.128571949 +0000 UTC m=+143.090772330" Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.161515 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:04 crc kubenswrapper[4740]: E1009 10:30:04.161978 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:04.661963449 +0000 UTC m=+143.624163830 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:04 crc kubenswrapper[4740]: W1009 10:30:04.183737 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b04fa40_8401_462e_8fc1_c55dbca89bbc.slice/crio-2864d024507ffebd3c846a0c401abc8b627f03285bda373be713f36b8ce0b542 WatchSource:0}: Error finding container 2864d024507ffebd3c846a0c401abc8b627f03285bda373be713f36b8ce0b542: Status 404 returned error can't find the container with id 2864d024507ffebd3c846a0c401abc8b627f03285bda373be713f36b8ce0b542 Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.210204 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2tnv4" podStartSLOduration=123.210187099 podStartE2EDuration="2m3.210187099s" podCreationTimestamp="2025-10-09 10:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:04.208170596 +0000 UTC m=+143.170370997" watchObservedRunningTime="2025-10-09 10:30:04.210187099 +0000 UTC m=+143.172387480" Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.262947 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Oct 09 10:30:04 crc kubenswrapper[4740]: E1009 10:30:04.263110 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:04.763086933 +0000 UTC m=+143.725287314 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.263524 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:04 crc kubenswrapper[4740]: E1009 10:30:04.263832 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:04.763822032 +0000 UTC m=+143.726022413 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.365394 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:04 crc kubenswrapper[4740]: E1009 10:30:04.366107 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:04.866086956 +0000 UTC m=+143.828287337 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.467415 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:04 crc kubenswrapper[4740]: E1009 10:30:04.468080 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:04.968063482 +0000 UTC m=+143.930263973 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.570435 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:04 crc kubenswrapper[4740]: E1009 10:30:04.570831 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:05.070817569 +0000 UTC m=+144.033017950 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.572930 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-f67b5" podStartSLOduration=123.572916044 podStartE2EDuration="2m3.572916044s" podCreationTimestamp="2025-10-09 10:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:04.569953976 +0000 UTC m=+143.532154357" watchObservedRunningTime="2025-10-09 10:30:04.572916044 +0000 UTC m=+143.535116435" Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.675980 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:04 crc kubenswrapper[4740]: E1009 10:30:04.676696 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:05.176681448 +0000 UTC m=+144.138881829 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.684096 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-598nr" event={"ID":"dc2dbb2e-c3ba-4ed2-becc-a6e809fe46a5","Type":"ContainerStarted","Data":"9f57e7f4a9dec60e52874a04230d46aa622355f82b20054098c84c9e1690b832"} Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.697957 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf" event={"ID":"e4411b16-07f8-4701-ad4f-7645a00e829f","Type":"ContainerStarted","Data":"cfb65d4d5bfc3ea9d9ce3e7401b180911e73dbfd09d9a7def757777d6d4b196f"} Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.710733 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-lrpmd" event={"ID":"645bfb4f-f372-4d0a-99da-d6942d8b773c","Type":"ContainerStarted","Data":"fbe334cf74a96d868e4f23c7b9fbb61805a6af1eb4e8fb83b19742e3ba1847f7"} Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.713956 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-vntt9" event={"ID":"c9c38db8-21e3-495b-b6db-3ea52bec9b5c","Type":"ContainerStarted","Data":"6635505eedc4e4faab9261fd825405773622feabafa6ef02837d8b48a2d900be"} Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.722848 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca/service-ca-9c57cc56f-842d9" event={"ID":"7b04fa40-8401-462e-8fc1-c55dbca89bbc","Type":"ContainerStarted","Data":"2864d024507ffebd3c846a0c401abc8b627f03285bda373be713f36b8ce0b542"} Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.729932 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qzp8b" event={"ID":"f2de169d-9583-46e5-b2ee-da1a6903eafb","Type":"ContainerStarted","Data":"9d9c1c21e32ca5db9268762dbc4f0c36a2db9552f82008f471ca16ccdc99f1f0"} Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.767958 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-g6tmc" event={"ID":"93d94603-9462-4c78-9a9f-ee66522eb4cf","Type":"ContainerStarted","Data":"b293427aa99dc9c4e1bca6e1ee7ef2727042cd6cb75ff058a26fb40e7c0f24c1"} Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.781653 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:04 crc kubenswrapper[4740]: E1009 10:30:04.782069 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:05.282051743 +0000 UTC m=+144.244252224 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.818283 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h4zqk" event={"ID":"58ba2ce5-2051-4631-a4dd-3b8bd96759f8","Type":"ContainerStarted","Data":"d04568073cf51b1f28c6533f4293cfa8d49be81baa8395578dc3f9dbbe8568aa"} Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.824823 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6fqj" event={"ID":"b2af699f-f757-4c89-ba00-55f9f8599fda","Type":"ContainerStarted","Data":"fdf446f46d4f3254ff696df34a82510a17867ecf13503835b96d8031625d4088"} Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.835945 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wscmf" podStartSLOduration=123.835923492 podStartE2EDuration="2m3.835923492s" podCreationTimestamp="2025-10-09 10:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:04.809818435 +0000 UTC m=+143.772018816" watchObservedRunningTime="2025-10-09 10:30:04.835923492 +0000 UTC m=+143.798123873" Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.838889 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-959ns"] Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.856083 4740 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g" event={"ID":"9063b645-eba3-4ba3-a871-23adad70136d","Type":"ContainerStarted","Data":"d940a9a65c725e4b4e9e0b418782e958caa57d3b13d92a265a5b84c31ed76a86"} Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.857329 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g" Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.888615 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:04 crc kubenswrapper[4740]: E1009 10:30:04.889099 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:05.389084433 +0000 UTC m=+144.351284814 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.890558 4740 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-5bn7g container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.890621 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g" podUID="9063b645-eba3-4ba3-a871-23adad70136d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Oct 09 10:30:04 crc kubenswrapper[4740]: W1009 10:30:04.891202 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1412af2_816f_44a8_9862_b1a86ea6b9bc.slice/crio-a0871316d71a87c152cb6785c3c5cec78a5defdc891f158ccbb378daf028bae3 WatchSource:0}: Error finding container a0871316d71a87c152cb6785c3c5cec78a5defdc891f158ccbb378daf028bae3: Status 404 returned error can't find the container with id a0871316d71a87c152cb6785c3c5cec78a5defdc891f158ccbb378daf028bae3 Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.891968 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6wv7f" event={"ID":"0daf28c1-6a40-4a53-a196-521d95be9aab","Type":"ContainerStarted","Data":"76151013f6efddba327e8b58bdec139ec9613f8ee200226309ce8c83c5152ccc"} Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.892870 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j" event={"ID":"8df53cdb-6c1c-41c2-8b24-5b73c400ced4","Type":"ContainerStarted","Data":"7a12e0b0dcb1162875d85bb6733a48de9af0e50d3e1533d5bf79b2543d8893d5"} Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.902555 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mr7wc"] Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.910836 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fs6nz" event={"ID":"974f7f71-f43b-4a14-bac3-567229c728c7","Type":"ContainerStarted","Data":"2127a9a81c3d7a495cfe3b0f6927b046ecf778a80e83c9b8cae5bf8fb43efccd"} Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.920003 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" event={"ID":"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad","Type":"ContainerStarted","Data":"6970bbe6018e4ea1e2edd85148d5bdd90d12742526de084ef529dd869f9d6054"} Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.921547 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.930070 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w956x" 
event={"ID":"1616614d-03d7-42ee-913f-711b77d1032f","Type":"ContainerStarted","Data":"62e37385a5748ef35a106fdf5bc51a00146e110b25f41517a95bf372d81ff131"} Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.942686 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-95p7x"] Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.960551 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvppr" event={"ID":"6c338011-b98d-4a6b-b48e-76025b1f0973","Type":"ContainerStarted","Data":"ca30e050e3cd88aa807b71242352067b8cc8d8bd772ba671d8ec1aa2b27bc4fd"} Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.979551 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-vntt9" Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.983366 4740 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-6zqw2 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.27:6443/healthz\": dial tcp 10.217.0.27:6443: connect: connection refused" start-of-body= Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.983415 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" podUID="b2d12f51-5b3d-4d6f-899f-af629cc0d4ad" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.27:6443/healthz\": dial tcp 10.217.0.27:6443: connect: connection refused" Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.986960 4740 patch_prober.go:28] interesting pod/router-default-5444994796-vntt9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 10:30:04 crc kubenswrapper[4740]: [-]has-synced failed: reason 
withheld Oct 09 10:30:04 crc kubenswrapper[4740]: [+]process-running ok Oct 09 10:30:04 crc kubenswrapper[4740]: healthz check failed Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.993471 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vntt9" podUID="c9c38db8-21e3-495b-b6db-3ea52bec9b5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.989791 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flsgk"] Oct 09 10:30:04 crc kubenswrapper[4740]: I1009 10:30:04.995209 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:05 crc kubenswrapper[4740]: E1009 10:30:05.000697 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:05.500677812 +0000 UTC m=+144.462878203 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.006680 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-c78zk"] Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.011966 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hstzh"] Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.012120 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-jrhp4" event={"ID":"d7e13929-564e-49dd-baab-987ea26c55a3","Type":"ContainerStarted","Data":"ddda7217f898ae72fa63ccba2a378004b488755d032963d0ef94d94f57bdf76f"} Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.016416 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" event={"ID":"bd8ebc5c-e47c-4177-968f-a3c924dbda0e","Type":"ContainerStarted","Data":"87c6313299ac708e1dc1306dd87c996ae0e3849481942cc944c001cdaad0b040"} Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.034782 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbxhv" event={"ID":"cd568fb3-5f33-4412-a84b-c37d56678927","Type":"ContainerStarted","Data":"de1bdb7b14f9e79155dd11017f123e0ce7da9d4ca84d260e7195ebf683e02e8e"} Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.042644 4740 generic.go:334] "Generic (PLEG): container finished" 
podID="29e2bc86-ce7e-4abd-93d7-7adf15987e18" containerID="fab2d1cc2d3717190bbae6c1ab6985c36e8933a01604eb0d58f1140f8e448f3b" exitCode=0 Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.043434 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zlt7s" event={"ID":"29e2bc86-ce7e-4abd-93d7-7adf15987e18","Type":"ContainerDied","Data":"fab2d1cc2d3717190bbae6c1ab6985c36e8933a01604eb0d58f1140f8e448f3b"} Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.045827 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qwphn"] Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.061021 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cm7ws"] Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.067319 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.070780 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-f67b5" Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.094886 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xhb6x"] Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.096506 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:05 crc kubenswrapper[4740]: E1009 10:30:05.098368 4740 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:05.598346545 +0000 UTC m=+144.560546926 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.100524 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-pxg57" podStartSLOduration=123.100509962 podStartE2EDuration="2m3.100509962s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:05.087631613 +0000 UTC m=+144.049831994" watchObservedRunningTime="2025-10-09 10:30:05.100509962 +0000 UTC m=+144.062710343" Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.162681 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zxkqd"] Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.173293 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-828hm"] Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.179158 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-ssz6d" podStartSLOduration=123.179136603 podStartE2EDuration="2m3.179136603s" podCreationTimestamp="2025-10-09 10:28:02 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:05.174324947 +0000 UTC m=+144.136525328" watchObservedRunningTime="2025-10-09 10:30:05.179136603 +0000 UTC m=+144.141336984" Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.198724 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:05 crc kubenswrapper[4740]: E1009 10:30:05.199787 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:05.699770967 +0000 UTC m=+144.661971348 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.213292 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fs6nz" podStartSLOduration=124.213273273 podStartE2EDuration="2m4.213273273s" podCreationTimestamp="2025-10-09 10:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:05.21203985 +0000 UTC m=+144.174240251" watchObservedRunningTime="2025-10-09 10:30:05.213273273 +0000 UTC m=+144.175473654" Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.313056 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:05 crc kubenswrapper[4740]: E1009 10:30:05.313441 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:05.813427281 +0000 UTC m=+144.775627662 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.333689 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-vntt9" podStartSLOduration=123.333675364 podStartE2EDuration="2m3.333675364s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:05.332809251 +0000 UTC m=+144.295009632" watchObservedRunningTime="2025-10-09 10:30:05.333675364 +0000 UTC m=+144.295875745" Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.373050 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g" podStartSLOduration=123.370383981 podStartE2EDuration="2m3.370383981s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:05.368210214 +0000 UTC m=+144.330410615" watchObservedRunningTime="2025-10-09 10:30:05.370383981 +0000 UTC m=+144.332584362" Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.409892 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.409952 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.414309 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:05 crc kubenswrapper[4740]: E1009 10:30:05.414659 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:05.914640537 +0000 UTC m=+144.876840918 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.470846 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" podStartSLOduration=124.470829767 podStartE2EDuration="2m4.470829767s" podCreationTimestamp="2025-10-09 10:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:05.46941389 +0000 UTC m=+144.431614261" watchObservedRunningTime="2025-10-09 10:30:05.470829767 +0000 UTC m=+144.433030148" Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.517220 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:05 crc kubenswrapper[4740]: E1009 10:30:05.517596 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:06.017583739 +0000 UTC m=+144.979784120 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.620166 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:05 crc kubenswrapper[4740]: E1009 10:30:05.620300 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:06.120271854 +0000 UTC m=+145.082472235 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.621444 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:05 crc kubenswrapper[4740]: E1009 10:30:05.622777 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:06.122742559 +0000 UTC m=+145.084942940 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.734518 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:05 crc kubenswrapper[4740]: E1009 10:30:05.735732 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:06.235709815 +0000 UTC m=+145.197910196 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.841243 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:05 crc kubenswrapper[4740]: E1009 10:30:05.843211 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:06.343192456 +0000 UTC m=+145.305392837 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.942535 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:05 crc kubenswrapper[4740]: E1009 10:30:05.942978 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:06.442963354 +0000 UTC m=+145.405163735 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.983686 4740 patch_prober.go:28] interesting pod/router-default-5444994796-vntt9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 09 10:30:05 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld
Oct 09 10:30:05 crc kubenswrapper[4740]: [+]process-running ok
Oct 09 10:30:05 crc kubenswrapper[4740]: healthz check failed
Oct 09 10:30:05 crc kubenswrapper[4740]: I1009 10:30:05.983990 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vntt9" podUID="c9c38db8-21e3-495b-b6db-3ea52bec9b5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.049386 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m"
Oct 09 10:30:06 crc kubenswrapper[4740]: E1009 10:30:06.049728 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:06.549712756 +0000 UTC m=+145.511913137 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.059466 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pjdt4" event={"ID":"626a56dc-ba4f-4ff6-a787-8f60403b4d42","Type":"ContainerStarted","Data":"b2cab764a0a9f1463e9feeef0ab39f9a70ffbf13013c28f353e563b590eff6af"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.060954 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xhb6x" event={"ID":"b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e","Type":"ContainerStarted","Data":"7165bae960772d9729866f75e0484e226e5bc34dd5503f21cf84e36ff11be5c0"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.062073 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbxhv" event={"ID":"cd568fb3-5f33-4412-a84b-c37d56678927","Type":"ContainerStarted","Data":"766e800144fcbfd535577e599e9491c4966a409a4a262733c263b824ce82f50c"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.065828 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-95p7x" event={"ID":"494881f5-dbba-48c3-9871-c8d81136eda3","Type":"ContainerStarted","Data":"979b2e82c7ba98556e45ca1464856ad476cea8b71d912ceff754a877aab546a8"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.079079 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" event={"ID":"833471e4-0651-45ca-aec1-35c2a8a56b5f","Type":"ContainerStarted","Data":"398f9c24c33bf2dc31848bc2190e20b449a4fd3065dbfaf3a77ee316abac48b3"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.083391 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-959ns" event={"ID":"a1412af2-816f-44a8-9862-b1a86ea6b9bc","Type":"ContainerStarted","Data":"5797e6e4ca5da2b35a0256d748e89bb5ceab4cf7f34ad78f9a9404b528269c02"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.083451 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-959ns" event={"ID":"a1412af2-816f-44a8-9862-b1a86ea6b9bc","Type":"ContainerStarted","Data":"a0871316d71a87c152cb6785c3c5cec78a5defdc891f158ccbb378daf028bae3"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.087883 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pjdt4" podStartSLOduration=125.08782839 podStartE2EDuration="2m5.08782839s" podCreationTimestamp="2025-10-09 10:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.086281949 +0000 UTC m=+145.048482340" watchObservedRunningTime="2025-10-09 10:30:06.08782839 +0000 UTC m=+145.050028771"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.089524 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-959ns"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.092259 4740 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-959ns container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body=
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.092322 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-959ns" podUID="a1412af2-816f-44a8-9862-b1a86ea6b9bc" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.103907 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-g68sq" event={"ID":"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39","Type":"ContainerStarted","Data":"88d557bf9f6cce99aa0abfe178173c12ff37c7235c5316fe71f010acc8747b0d"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.111192 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-959ns" podStartSLOduration=124.111170115 podStartE2EDuration="2m4.111170115s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.107499808 +0000 UTC m=+145.069700199" watchObservedRunningTime="2025-10-09 10:30:06.111170115 +0000 UTC m=+145.073370506"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.115192 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kp59b" event={"ID":"c1705605-4391-45da-a171-23f5a7e0ff74","Type":"ContainerStarted","Data":"3655a4bd2ae783c7e3109ba7b3bc946a51374c4cf257c5c0e0dd90c22a5dcbbf"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.130065 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w956x" event={"ID":"1616614d-03d7-42ee-913f-711b77d1032f","Type":"ContainerStarted","Data":"33eef04eeb292c759e0e9ce3f5fc8480ddc5d94a170c0c3fdc3a6c202e0d5ab8"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.130891 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w956x"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.133269 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbxhv" podStartSLOduration=124.133251247 podStartE2EDuration="2m4.133251247s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.13110847 +0000 UTC m=+145.093308861" watchObservedRunningTime="2025-10-09 10:30:06.133251247 +0000 UTC m=+145.095451628"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.150485 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cm7ws" event={"ID":"4d53b55b-dbd8-420d-bc00-128f7d5e1580","Type":"ContainerStarted","Data":"d0d51c4be692e91b6c4546afb18e5e5fac03bc87247c439c424ce63d16a118a8"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.159213 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cm7ws" event={"ID":"4d53b55b-dbd8-420d-bc00-128f7d5e1580","Type":"ContainerStarted","Data":"31fe90691fc918d99d7aa7b2bce77b537448412d1a8fdb72190692beb9ae79f9"}
Oct 09 10:30:06 crc kubenswrapper[4740]: E1009 10:30:06.153021 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:06.653004987 +0000 UTC m=+145.615205368 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.152961 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.163082 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-lrpmd" event={"ID":"645bfb4f-f372-4d0a-99da-d6942d8b773c","Type":"ContainerStarted","Data":"0e94986322991d9b6bdb2ac4b1ea8c3152555765bf81d1c3ca4945a5024cf370"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.166260 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m"
Oct 09 10:30:06 crc kubenswrapper[4740]: E1009 10:30:06.173127 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:06.673104096 +0000 UTC m=+145.635304477 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.175233 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qzp8b" event={"ID":"f2de169d-9583-46e5-b2ee-da1a6903eafb","Type":"ContainerStarted","Data":"692b1c7a8b8cdd3bd07aaedad6f6f227324d18147815baea908e48af3a0401b2"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.178089 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-qzp8b"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.178463 4740 patch_prober.go:28] interesting pod/downloads-7954f5f757-qzp8b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body=
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.178501 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qzp8b" podUID="f2de169d-9583-46e5-b2ee-da1a6903eafb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.189927 4740 generic.go:334] "Generic (PLEG): container finished" podID="68bb29c6-1224-44e0-b307-4a2b226288c5" containerID="179c8ceb294fc089a2cebcd56d94aa841a5da46bc59d7e2e292331cbd36ca570" exitCode=0
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.190038 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" event={"ID":"68bb29c6-1224-44e0-b307-4a2b226288c5","Type":"ContainerDied","Data":"179c8ceb294fc089a2cebcd56d94aa841a5da46bc59d7e2e292331cbd36ca570"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.193778 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf" event={"ID":"e4411b16-07f8-4701-ad4f-7645a00e829f","Type":"ContainerStarted","Data":"c870180cedd7934f886a0c284ef2c75616503f107aa0fac00c294108d4f2996a"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.207605 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" podStartSLOduration=124.207582135 podStartE2EDuration="2m4.207582135s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.162687992 +0000 UTC m=+145.124888373" watchObservedRunningTime="2025-10-09 10:30:06.207582135 +0000 UTC m=+145.169782516"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.252570 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flsgk" event={"ID":"2fd6215a-2d0b-48c5-be33-130bb55803c7","Type":"ContainerStarted","Data":"fede91c82ddc01b3b9a71ebd3f901aeb38f5f8b4d5b0352d40e2bceeeb1d3f29"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.252611 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flsgk" event={"ID":"2fd6215a-2d0b-48c5-be33-130bb55803c7","Type":"ContainerStarted","Data":"54b99207e1ea8286d8e3a0e903556e58a989a4dd02270ad55c22cfe7d783a629"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.255572 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cm7ws" podStartSLOduration=124.255550838 podStartE2EDuration="2m4.255550838s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.228025803 +0000 UTC m=+145.190226184" watchObservedRunningTime="2025-10-09 10:30:06.255550838 +0000 UTC m=+145.217751239"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.256307 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-g68sq" podStartSLOduration=125.256297738 podStartE2EDuration="2m5.256297738s" podCreationTimestamp="2025-10-09 10:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.252019225 +0000 UTC m=+145.214219606" watchObservedRunningTime="2025-10-09 10:30:06.256297738 +0000 UTC m=+145.218498119"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.278328 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 09 10:30:06 crc kubenswrapper[4740]: E1009 10:30:06.280008 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:06.77992625 +0000 UTC m=+145.742126671 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.289489 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w956x" podStartSLOduration=124.289472602 podStartE2EDuration="2m4.289472602s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.288226499 +0000 UTC m=+145.250426880" watchObservedRunningTime="2025-10-09 10:30:06.289472602 +0000 UTC m=+145.251672983"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.313284 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mr7wc" event={"ID":"91bc0d62-1ab0-4ca6-ad8d-ab99a4eea54b","Type":"ContainerStarted","Data":"183fb4467126357e2ae0f991aaefb2a839ee1b3d1707b7fc62bb253705292687"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.313334 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mr7wc" event={"ID":"91bc0d62-1ab0-4ca6-ad8d-ab99a4eea54b","Type":"ContainerStarted","Data":"f28cc2155b4f031569c259ec41fa9613192bac6b061fe8f48354029493ad707e"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.342395 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flsgk" podStartSLOduration=124.342375285 podStartE2EDuration="2m4.342375285s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.312711414 +0000 UTC m=+145.274911795" watchObservedRunningTime="2025-10-09 10:30:06.342375285 +0000 UTC m=+145.304575666"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.344490 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-lrpmd" podStartSLOduration=125.344476611 podStartE2EDuration="2m5.344476611s" podCreationTimestamp="2025-10-09 10:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.341911833 +0000 UTC m=+145.304112214" watchObservedRunningTime="2025-10-09 10:30:06.344476611 +0000 UTC m=+145.306676992"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.357945 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd" event={"ID":"07b58f91-881e-4c94-96b6-ff6126e39824","Type":"ContainerStarted","Data":"c3dc7acc897d77cafe46c80d73775f6c0613d79766875380b5aefb2dff390eb7"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.357993 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd" event={"ID":"07b58f91-881e-4c94-96b6-ff6126e39824","Type":"ContainerStarted","Data":"045b5e79b53f8c78bdca56f4d17c503ab7b5327861294556149ff41f7b3e6438"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.358842 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.361827 4740 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zxkqd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body=
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.361871 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd" podUID="07b58f91-881e-4c94-96b6-ff6126e39824" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.382768 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m"
Oct 09 10:30:06 crc kubenswrapper[4740]: E1009 10:30:06.383132 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:06.883118559 +0000 UTC m=+145.845318940 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.385435 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h4zqk" event={"ID":"58ba2ce5-2051-4631-a4dd-3b8bd96759f8","Type":"ContainerStarted","Data":"2431e624129a416c8f48c3d9aede8b67db0b8e4bff2a35c216ef08b6aa946f28"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.385468 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h4zqk" event={"ID":"58ba2ce5-2051-4631-a4dd-3b8bd96759f8","Type":"ContainerStarted","Data":"0498e34400cef441363704c3cad28d8c703cfa4a45331b527015c0dfac8a47f1"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.391202 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-598nr" event={"ID":"dc2dbb2e-c3ba-4ed2-becc-a6e809fe46a5","Type":"ContainerStarted","Data":"97ad0f41a24f948cbf29c1a691f8a84e8fbb45c84fe80c582c6b434dbe3d18c0"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.437168 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hstzh" event={"ID":"7d9654ef-644a-4274-b02b-c8eaf9d53a96","Type":"ContainerStarted","Data":"e3e5184a3a6279e6fb2d8fc0ffd44406839d16e877b2e2774a5620467bbc0292"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.437219 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hstzh" event={"ID":"7d9654ef-644a-4274-b02b-c8eaf9d53a96","Type":"ContainerStarted","Data":"1cd0d579034c137da1c8ed6dc1a98619918faf84da406d2c2ef22e87db6d637a"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.438195 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hstzh"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.439768 4740 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-hstzh container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body=
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.439816 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hstzh" podUID="7d9654ef-644a-4274-b02b-c8eaf9d53a96" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.455860 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-qzp8b" podStartSLOduration=125.455841044 podStartE2EDuration="2m5.455841044s" podCreationTimestamp="2025-10-09 10:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.427899358 +0000 UTC m=+145.390099739" watchObservedRunningTime="2025-10-09 10:30:06.455841044 +0000 UTC m=+145.418041435"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.462179 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6wv7f" event={"ID":"0daf28c1-6a40-4a53-a196-521d95be9aab","Type":"ContainerStarted","Data":"918bcb46528b28ce243322a9bad531e3ac7bb7693e29e8d42c8c12fa58340f7c"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.464464 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf" podStartSLOduration=6.464440891 podStartE2EDuration="6.464440891s" podCreationTimestamp="2025-10-09 10:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.459659635 +0000 UTC m=+145.421860016" watchObservedRunningTime="2025-10-09 10:30:06.464440891 +0000 UTC m=+145.426641272"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.497439 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 09 10:30:06 crc kubenswrapper[4740]: E1009 10:30:06.498562 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:06.998546939 +0000 UTC m=+145.960747320 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.532290 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h4zqk" podStartSLOduration=124.532270428 podStartE2EDuration="2m4.532270428s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.53123471 +0000 UTC m=+145.493435091" watchObservedRunningTime="2025-10-09 10:30:06.532270428 +0000 UTC m=+145.494470809"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.535129 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvppr" event={"ID":"6c338011-b98d-4a6b-b48e-76025b1f0973","Type":"ContainerStarted","Data":"866014e02d58b2748ac85086755b518af2813dac3ac51f522ef5d0ab03ef4b0e"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.535182 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvppr" event={"ID":"6c338011-b98d-4a6b-b48e-76025b1f0973","Type":"ContainerStarted","Data":"015f8a2e4efa8653e07e65309c644b7bdcf391e95ff65318398d81587ff724db"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.563186 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-828hm" event={"ID":"4c0cc7fa-dbfa-417d-bdcd-eb9dfb10a67d","Type":"ContainerStarted","Data":"55bbe587ef3e5f5cf06a3f7913abe36bb781750af9cd4a176d1fc9a4d124c632"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.563503 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-828hm" event={"ID":"4c0cc7fa-dbfa-417d-bdcd-eb9dfb10a67d","Type":"ContainerStarted","Data":"04c7d13d13c0c9d3a693f8fa5726d115ef7b26d12cc529f814f491e005c5efbd"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.595214 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-g6tmc" event={"ID":"93d94603-9462-4c78-9a9f-ee66522eb4cf","Type":"ContainerStarted","Data":"8ec48718229572b553ad3372d8bdf6f399170669c6d2d29cb0439acd6e7b2749"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.595933 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hstzh" podStartSLOduration=124.595916724 podStartE2EDuration="2m4.595916724s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.595847402 +0000 UTC m=+145.558047783" watchObservedRunningTime="2025-10-09 10:30:06.595916724 +0000 UTC m=+145.558117095"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.596525 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mr7wc" podStartSLOduration=124.59651767 podStartE2EDuration="2m4.59651767s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.55705195 +0000 UTC m=+145.519252331" watchObservedRunningTime="2025-10-09 10:30:06.59651767 +0000 UTC m=+145.558718051"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.601536 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m"
Oct 09 10:30:06 crc kubenswrapper[4740]: E1009 10:30:06.604274 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:07.104251744 +0000 UTC m=+146.066452185 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.611617 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j" event={"ID":"8df53cdb-6c1c-41c2-8b24-5b73c400ced4","Type":"ContainerStarted","Data":"5a643af8319983082e55e1cc38a7fd1ef3932077faca66eb0c33163966c57029"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.612503 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.621901 4740 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-kgt6j container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" start-of-body=
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.621963 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j" podUID="8df53cdb-6c1c-41c2-8b24-5b73c400ced4" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.630188 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-842d9" event={"ID":"7b04fa40-8401-462e-8fc1-c55dbca89bbc","Type":"ContainerStarted","Data":"3fee32c136e7f67ddecfc1e1c0328ee78381844faa8ade665551e3db124519de"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.634426 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-c78zk" event={"ID":"b8d56056-c06c-441c-8936-0416f53f5da0","Type":"ContainerStarted","Data":"a0b03bed9293ca5d1adb34be328b4d47334da544c371de440d34c74e6994c2f4"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.634468 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-c78zk" event={"ID":"b8d56056-c06c-441c-8936-0416f53f5da0","Type":"ContainerStarted","Data":"e8003028eeddb83b4bcdaec4e2295493739a43d0585e63009e025a0a03bafd17"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.657284 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-598nr" podStartSLOduration=124.65726143 podStartE2EDuration="2m4.65726143s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.622082243 +0000 UTC m=+145.584282624" watchObservedRunningTime="2025-10-09 10:30:06.65726143 +0000 UTC m=+145.619461821"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.679289 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zlt7s" event={"ID":"29e2bc86-ce7e-4abd-93d7-7adf15987e18","Type":"ContainerStarted","Data":"d37b3619d0b69f4e038f451f37a56f5f3ad0e2e7614a1bf5ff073e35beef8912"}
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.679283 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-842d9" podStartSLOduration=124.67926781 podStartE2EDuration="2m4.67926781s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.678597632 +0000 UTC m=+145.640798003" watchObservedRunningTime="2025-10-09 10:30:06.67926781 +0000 UTC m=+145.641468191"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.679463 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd" podStartSLOduration=124.679456615 podStartE2EDuration="2m4.679456615s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.661344238 +0000 UTC m=+145.623544619" watchObservedRunningTime="2025-10-09 10:30:06.679456615 +0000 UTC m=+145.641656996"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.679845 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zlt7s"
Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.704146
4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:06 crc kubenswrapper[4740]: E1009 10:30:06.705187 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:07.205169962 +0000 UTC m=+146.167370343 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.705294 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6fqj" event={"ID":"b2af699f-f757-4c89-ba00-55f9f8599fda","Type":"ContainerStarted","Data":"2ee4f5d52616d9cc01b562aacca6587b72af971efb7e7fd65ed79d6b41385741"} Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.715258 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qwphn" event={"ID":"9ab9ab18-43b3-4138-a7a9-ee90d3abe1e1","Type":"ContainerStarted","Data":"cb6eb90e7de0f7bb18bde121d015d948de9fd8cb5c1ac46d409f8ce3003547ec"} Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.715299 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qwphn" event={"ID":"9ab9ab18-43b3-4138-a7a9-ee90d3abe1e1","Type":"ContainerStarted","Data":"9eda0bf6e8e83ae31d5d1370c4caa085bd64d82aaafb842519956fd0a12cee34"} Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.760550 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6wv7f" podStartSLOduration=124.760536091 podStartE2EDuration="2m4.760536091s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.723010602 +0000 UTC m=+145.685210993" watchObservedRunningTime="2025-10-09 10:30:06.760536091 +0000 UTC m=+145.722736472" Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.762137 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-jrhp4" event={"ID":"d7e13929-564e-49dd-baab-987ea26c55a3","Type":"ContainerStarted","Data":"0c663150da409b75029edca95eb916b884d2f9d384983699ae76fa72d29c5893"} Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.778618 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.779409 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g" Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.800305 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-828hm" podStartSLOduration=124.800202135 podStartE2EDuration="2m4.800202135s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.761596958 +0000 UTC m=+145.723797359" watchObservedRunningTime="2025-10-09 10:30:06.800202135 +0000 UTC m=+145.762402516" Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.802110 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j" podStartSLOduration=124.802102995 podStartE2EDuration="2m4.802102995s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.791389723 +0000 UTC m=+145.753590114" watchObservedRunningTime="2025-10-09 10:30:06.802102995 +0000 UTC m=+145.764303376" Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.807256 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:06 crc kubenswrapper[4740]: E1009 10:30:06.809996 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:07.309976333 +0000 UTC m=+146.272176804 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.855969 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-g6tmc" podStartSLOduration=124.855949964 podStartE2EDuration="2m4.855949964s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.817953573 +0000 UTC m=+145.780153964" watchObservedRunningTime="2025-10-09 10:30:06.855949964 +0000 UTC m=+145.818150345" Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.856272 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvppr" podStartSLOduration=125.856268312 podStartE2EDuration="2m5.856268312s" podCreationTimestamp="2025-10-09 10:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.852233476 +0000 UTC m=+145.814433857" watchObservedRunningTime="2025-10-09 10:30:06.856268312 +0000 UTC m=+145.818468693" Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.895652 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-c78zk" podStartSLOduration=6.895635819 podStartE2EDuration="6.895635819s" podCreationTimestamp="2025-10-09 10:30:00 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.894603962 +0000 UTC m=+145.856804343" watchObservedRunningTime="2025-10-09 10:30:06.895635819 +0000 UTC m=+145.857836200" Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.911655 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:06 crc kubenswrapper[4740]: E1009 10:30:06.912174 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:07.412159045 +0000 UTC m=+146.374359416 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.966176 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6fqj" podStartSLOduration=124.966161637 podStartE2EDuration="2m4.966161637s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.965941641 +0000 UTC m=+145.928142022" watchObservedRunningTime="2025-10-09 10:30:06.966161637 +0000 UTC m=+145.928362018" Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.968734 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zlt7s" podStartSLOduration=125.968719594 podStartE2EDuration="2m5.968719594s" podCreationTimestamp="2025-10-09 10:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:06.91844257 +0000 UTC m=+145.880642951" watchObservedRunningTime="2025-10-09 10:30:06.968719594 +0000 UTC m=+145.930919975" Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.982928 4740 patch_prober.go:28] interesting pod/router-default-5444994796-vntt9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 10:30:06 crc 
kubenswrapper[4740]: [-]has-synced failed: reason withheld Oct 09 10:30:06 crc kubenswrapper[4740]: [+]process-running ok Oct 09 10:30:06 crc kubenswrapper[4740]: healthz check failed Oct 09 10:30:06 crc kubenswrapper[4740]: I1009 10:30:06.982978 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vntt9" podUID="c9c38db8-21e3-495b-b6db-3ea52bec9b5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.022252 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:07 crc kubenswrapper[4740]: E1009 10:30:07.022909 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:07.522888541 +0000 UTC m=+146.485088932 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.038227 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-jrhp4" podStartSLOduration=7.038209445 podStartE2EDuration="7.038209445s" podCreationTimestamp="2025-10-09 10:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:07.037356753 +0000 UTC m=+145.999557134" watchObservedRunningTime="2025-10-09 10:30:07.038209445 +0000 UTC m=+146.000409826" Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.125305 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:07 crc kubenswrapper[4740]: E1009 10:30:07.125455 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:07.625432653 +0000 UTC m=+146.587633034 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.125488 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:07 crc kubenswrapper[4740]: E1009 10:30:07.125846 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:07.625833123 +0000 UTC m=+146.588033504 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.184236 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qwphn" podStartSLOduration=125.184221671 podStartE2EDuration="2m5.184221671s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:07.183098312 +0000 UTC m=+146.145298693" watchObservedRunningTime="2025-10-09 10:30:07.184221671 +0000 UTC m=+146.146422042" Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.226852 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:07 crc kubenswrapper[4740]: E1009 10:30:07.227169 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:07.727150942 +0000 UTC m=+146.689351323 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.324941 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.325177 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.329371 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:07 crc kubenswrapper[4740]: E1009 10:30:07.329685 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:07.829675153 +0000 UTC m=+146.791875534 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.430790 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:07 crc kubenswrapper[4740]: E1009 10:30:07.430964 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:07.930937689 +0000 UTC m=+146.893138070 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.431079 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:07 crc kubenswrapper[4740]: E1009 10:30:07.431381 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:07.931373301 +0000 UTC m=+146.893573682 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.532105 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:07 crc kubenswrapper[4740]: E1009 10:30:07.532290 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:08.032265458 +0000 UTC m=+146.994465839 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.532540 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:07 crc kubenswrapper[4740]: E1009 10:30:07.532827 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:08.032820493 +0000 UTC m=+146.995020874 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.633234 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:07 crc kubenswrapper[4740]: E1009 10:30:07.633449 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:08.133419353 +0000 UTC m=+147.095619844 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.633567 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:07 crc kubenswrapper[4740]: E1009 10:30:07.633892 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:08.133884205 +0000 UTC m=+147.096084586 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.734289 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:07 crc kubenswrapper[4740]: E1009 10:30:07.734503 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:08.234470695 +0000 UTC m=+147.196671076 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.734563 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:07 crc kubenswrapper[4740]: E1009 10:30:07.734918 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:08.234903006 +0000 UTC m=+147.197103477 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.766920 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kp59b" event={"ID":"c1705605-4391-45da-a171-23f5a7e0ff74","Type":"ContainerStarted","Data":"e6da92d4af0c65edd48cda0f56036c0437f6170ff7f2b6c668b3ca10729586cc"} Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.771149 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" event={"ID":"68bb29c6-1224-44e0-b307-4a2b226288c5","Type":"ContainerStarted","Data":"22740e58dfbe0083c20377dc3eef765d50c23429f54fb951fe30b18b701ce38f"} Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.771209 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" event={"ID":"68bb29c6-1224-44e0-b307-4a2b226288c5","Type":"ContainerStarted","Data":"7c5d4081a69b563f87c77c60fd23b97d599348b6a71186b533ee14c2baffe276"} Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.773551 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w956x" event={"ID":"1616614d-03d7-42ee-913f-711b77d1032f","Type":"ContainerStarted","Data":"2a6c14986ef9c12a830838825fdb5f36db26dde65d62efe609d3c3591941edf7"} Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.776266 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qwphn" 
event={"ID":"9ab9ab18-43b3-4138-a7a9-ee90d3abe1e1","Type":"ContainerStarted","Data":"74b62506cefac80dd77a272a1ff320138626f1fc48e400a75b5e58eafeb5a26d"} Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.779342 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6fqj" event={"ID":"b2af699f-f757-4c89-ba00-55f9f8599fda","Type":"ContainerStarted","Data":"6da6bd9b2bf5f59187baa32d94788d391a44900032e1fe602fdd2f06bc1aa0ab"} Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.781377 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xhb6x" event={"ID":"b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e","Type":"ContainerStarted","Data":"f7c139b9cdffaf636013778a3ddfac20bbe34ef24494b53d1789b602fa2e9128"} Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.781420 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xhb6x" event={"ID":"b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e","Type":"ContainerStarted","Data":"bf04fa29be2f53cf61d60b43585def62cc1011185012b3f30d3feb9457e59e79"} Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.781926 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-xhb6x" Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.783204 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-95p7x" event={"ID":"494881f5-dbba-48c3-9871-c8d81136eda3","Type":"ContainerStarted","Data":"a180c49c38a4138e636eabf45f92721cd88eab0f134825528c174ac7bcf895f2"} Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.785122 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-828hm" event={"ID":"4c0cc7fa-dbfa-417d-bdcd-eb9dfb10a67d","Type":"ContainerStarted","Data":"d5771d98df48af74bed6303cd63ac1e094553a68d27749b20d338823efe52537"} Oct 09 10:30:07 crc 
kubenswrapper[4740]: I1009 10:30:07.790913 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-g6tmc" event={"ID":"93d94603-9462-4c78-9a9f-ee66522eb4cf","Type":"ContainerStarted","Data":"8abf92162ce86ed6fcc98675b4f7d067d414387709ce42a2fa522c766d79dfa7"} Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.795183 4740 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zxkqd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.795236 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd" podUID="07b58f91-881e-4c94-96b6-ff6126e39824" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.795382 4740 patch_prober.go:28] interesting pod/downloads-7954f5f757-qzp8b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.795406 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qzp8b" podUID="f2de169d-9583-46e5-b2ee-da1a6903eafb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.823365 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-959ns" Oct 09 10:30:07 crc 
kubenswrapper[4740]: I1009 10:30:07.837474 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:07 crc kubenswrapper[4740]: E1009 10:30:07.837615 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:08.337588251 +0000 UTC m=+147.299788632 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.837881 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:07 crc kubenswrapper[4740]: E1009 10:30:07.839275 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-09 10:30:08.339258005 +0000 UTC m=+147.301458496 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.854226 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hstzh" Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.880271 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-kp59b" podStartSLOduration=125.880250925 podStartE2EDuration="2m5.880250925s" podCreationTimestamp="2025-10-09 10:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:07.840372375 +0000 UTC m=+146.802572746" watchObservedRunningTime="2025-10-09 10:30:07.880250925 +0000 UTC m=+146.842451316" Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.880697 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xhb6x" podStartSLOduration=7.880688327 podStartE2EDuration="7.880688327s" podCreationTimestamp="2025-10-09 10:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:07.878638813 +0000 UTC m=+146.840839194" watchObservedRunningTime="2025-10-09 10:30:07.880688327 +0000 UTC m=+146.842888708" Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 
10:30:07.938975 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:07 crc kubenswrapper[4740]: E1009 10:30:07.940243 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:08.440217215 +0000 UTC m=+147.402417596 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.983380 4740 patch_prober.go:28] interesting pod/router-default-5444994796-vntt9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 10:30:07 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Oct 09 10:30:07 crc kubenswrapper[4740]: [+]process-running ok Oct 09 10:30:07 crc kubenswrapper[4740]: healthz check failed Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.983452 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vntt9" podUID="c9c38db8-21e3-495b-b6db-3ea52bec9b5c" containerName="router" probeResult="failure" output="HTTP probe 
failed with statuscode: 500" Oct 09 10:30:07 crc kubenswrapper[4740]: I1009 10:30:07.990162 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" podStartSLOduration=126.99014135 podStartE2EDuration="2m6.99014135s" podCreationTimestamp="2025-10-09 10:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:07.977264401 +0000 UTC m=+146.939464782" watchObservedRunningTime="2025-10-09 10:30:07.99014135 +0000 UTC m=+146.952341731" Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.040649 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.042021 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:08 crc kubenswrapper[4740]: E1009 10:30:08.042456 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:08.542443077 +0000 UTC m=+147.504643458 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.143028 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:08 crc kubenswrapper[4740]: E1009 10:30:08.143310 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:08.643289894 +0000 UTC m=+147.605490275 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.143438 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:08 crc kubenswrapper[4740]: E1009 10:30:08.143741 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:08.643731266 +0000 UTC m=+147.605931647 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.159479 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgt6j" Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.244321 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:08 crc kubenswrapper[4740]: E1009 10:30:08.244480 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:08.744457329 +0000 UTC m=+147.706657720 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.244560 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:08 crc kubenswrapper[4740]: E1009 10:30:08.244877 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:08.74486604 +0000 UTC m=+147.707066421 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.345395 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:08 crc kubenswrapper[4740]: E1009 10:30:08.345550 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:08.845518361 +0000 UTC m=+147.807718752 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.345640 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:08 crc kubenswrapper[4740]: E1009 10:30:08.346021 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:08.846011604 +0000 UTC m=+147.808211995 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.446770 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:08 crc kubenswrapper[4740]: E1009 10:30:08.446926 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:08.946903052 +0000 UTC m=+147.909103433 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.446987 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:08 crc kubenswrapper[4740]: E1009 10:30:08.447345 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:08.947335203 +0000 UTC m=+147.909535644 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.548213 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:08 crc kubenswrapper[4740]: E1009 10:30:08.548422 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.048398195 +0000 UTC m=+148.010598576 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.548698 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:08 crc kubenswrapper[4740]: E1009 10:30:08.549089 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.049076843 +0000 UTC m=+148.011277214 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.650205 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:08 crc kubenswrapper[4740]: E1009 10:30:08.650303 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.150282629 +0000 UTC m=+148.112483030 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.650740 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:08 crc kubenswrapper[4740]: E1009 10:30:08.651000 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.150992258 +0000 UTC m=+148.113192629 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.751564 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:08 crc kubenswrapper[4740]: E1009 10:30:08.751741 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.251716461 +0000 UTC m=+148.213916842 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.751876 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:08 crc kubenswrapper[4740]: E1009 10:30:08.752209 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.252196574 +0000 UTC m=+148.214396945 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.794820 4740 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zxkqd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.794870 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd" podUID="07b58f91-881e-4c94-96b6-ff6126e39824" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.794820 4740 patch_prober.go:28] interesting pod/downloads-7954f5f757-qzp8b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.794942 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qzp8b" podUID="f2de169d-9583-46e5-b2ee-da1a6903eafb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.801539 4740 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6srf8" Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.852588 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:08 crc kubenswrapper[4740]: E1009 10:30:08.852717 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.352698311 +0000 UTC m=+148.314898682 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.853361 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:08 crc kubenswrapper[4740]: E1009 10:30:08.856522 4740 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.356485201 +0000 UTC m=+148.318685662 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.881760 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zlt7s" Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.954188 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:08 crc kubenswrapper[4740]: E1009 10:30:08.954546 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.454531884 +0000 UTC m=+148.416732265 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.982099 4740 patch_prober.go:28] interesting pod/router-default-5444994796-vntt9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 10:30:08 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Oct 09 10:30:08 crc kubenswrapper[4740]: [+]process-running ok Oct 09 10:30:08 crc kubenswrapper[4740]: healthz check failed Oct 09 10:30:08 crc kubenswrapper[4740]: I1009 10:30:08.982154 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vntt9" podUID="c9c38db8-21e3-495b-b6db-3ea52bec9b5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.056069 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:09 crc kubenswrapper[4740]: E1009 10:30:09.056435 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-09 10:30:09.556418198 +0000 UTC m=+148.518618579 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.157105 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:09 crc kubenswrapper[4740]: E1009 10:30:09.157294 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.657269584 +0000 UTC m=+148.619469965 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.157723 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:09 crc kubenswrapper[4740]: E1009 10:30:09.158021 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.658011944 +0000 UTC m=+148.620212325 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.258806 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:09 crc kubenswrapper[4740]: E1009 10:30:09.258998 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.758969953 +0000 UTC m=+148.721170344 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.259294 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:09 crc kubenswrapper[4740]: E1009 10:30:09.259722 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.759704523 +0000 UTC m=+148.721904904 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.360136 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:09 crc kubenswrapper[4740]: E1009 10:30:09.360332 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.860308253 +0000 UTC m=+148.822508634 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.360386 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:09 crc kubenswrapper[4740]: E1009 10:30:09.360703 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.860695543 +0000 UTC m=+148.822895924 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.462043 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:09 crc kubenswrapper[4740]: E1009 10:30:09.462184 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.962154286 +0000 UTC m=+148.924354667 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.462350 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:09 crc kubenswrapper[4740]: E1009 10:30:09.462728 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:09.962721361 +0000 UTC m=+148.924921742 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.498062 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jckll"] Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.499007 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jckll" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.501033 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.513931 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jckll"] Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.564116 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:09 crc kubenswrapper[4740]: E1009 10:30:09.564305 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:10.064280056 +0000 UTC m=+149.026480427 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.564422 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86840e3-2c55-417d-9fa9-6eccaa01ad1a-utilities\") pod \"certified-operators-jckll\" (UID: \"e86840e3-2c55-417d-9fa9-6eccaa01ad1a\") " pod="openshift-marketplace/certified-operators-jckll" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.564477 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86840e3-2c55-417d-9fa9-6eccaa01ad1a-catalog-content\") pod \"certified-operators-jckll\" (UID: \"e86840e3-2c55-417d-9fa9-6eccaa01ad1a\") " pod="openshift-marketplace/certified-operators-jckll" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.564526 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.564583 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzgkt\" (UniqueName: 
\"kubernetes.io/projected/e86840e3-2c55-417d-9fa9-6eccaa01ad1a-kube-api-access-vzgkt\") pod \"certified-operators-jckll\" (UID: \"e86840e3-2c55-417d-9fa9-6eccaa01ad1a\") " pod="openshift-marketplace/certified-operators-jckll" Oct 09 10:30:09 crc kubenswrapper[4740]: E1009 10:30:09.564841 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:10.06482582 +0000 UTC m=+149.027026271 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.666136 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.666465 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.666507 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86840e3-2c55-417d-9fa9-6eccaa01ad1a-utilities\") pod \"certified-operators-jckll\" (UID: \"e86840e3-2c55-417d-9fa9-6eccaa01ad1a\") " pod="openshift-marketplace/certified-operators-jckll" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.666534 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86840e3-2c55-417d-9fa9-6eccaa01ad1a-catalog-content\") pod \"certified-operators-jckll\" (UID: \"e86840e3-2c55-417d-9fa9-6eccaa01ad1a\") " pod="openshift-marketplace/certified-operators-jckll" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.666590 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzgkt\" (UniqueName: \"kubernetes.io/projected/e86840e3-2c55-417d-9fa9-6eccaa01ad1a-kube-api-access-vzgkt\") pod \"certified-operators-jckll\" (UID: \"e86840e3-2c55-417d-9fa9-6eccaa01ad1a\") " pod="openshift-marketplace/certified-operators-jckll" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.666616 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.666644 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.666673 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:30:09 crc kubenswrapper[4740]: E1009 10:30:09.667214 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:10.167184926 +0000 UTC m=+149.129385327 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.669439 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86840e3-2c55-417d-9fa9-6eccaa01ad1a-utilities\") pod \"certified-operators-jckll\" (UID: \"e86840e3-2c55-417d-9fa9-6eccaa01ad1a\") " pod="openshift-marketplace/certified-operators-jckll" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.669791 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86840e3-2c55-417d-9fa9-6eccaa01ad1a-catalog-content\") pod \"certified-operators-jckll\" (UID: \"e86840e3-2c55-417d-9fa9-6eccaa01ad1a\") " 
pod="openshift-marketplace/certified-operators-jckll" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.672661 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.675999 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.677552 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.679239 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.698715 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lrbjh"] Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 
10:30:09.699614 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrbjh" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.707128 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.713390 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzgkt\" (UniqueName: \"kubernetes.io/projected/e86840e3-2c55-417d-9fa9-6eccaa01ad1a-kube-api-access-vzgkt\") pod \"certified-operators-jckll\" (UID: \"e86840e3-2c55-417d-9fa9-6eccaa01ad1a\") " pod="openshift-marketplace/certified-operators-jckll" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.731733 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lrbjh"] Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.767833 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3267f79-181b-4b3e-b0c6-eba2901bf0cc-utilities\") pod \"community-operators-lrbjh\" (UID: \"b3267f79-181b-4b3e-b0c6-eba2901bf0cc\") " pod="openshift-marketplace/community-operators-lrbjh" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.767895 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3267f79-181b-4b3e-b0c6-eba2901bf0cc-catalog-content\") pod \"community-operators-lrbjh\" (UID: \"b3267f79-181b-4b3e-b0c6-eba2901bf0cc\") " pod="openshift-marketplace/community-operators-lrbjh" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.767923 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnlgh\" (UniqueName: 
\"kubernetes.io/projected/b3267f79-181b-4b3e-b0c6-eba2901bf0cc-kube-api-access-vnlgh\") pod \"community-operators-lrbjh\" (UID: \"b3267f79-181b-4b3e-b0c6-eba2901bf0cc\") " pod="openshift-marketplace/community-operators-lrbjh" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.767971 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:09 crc kubenswrapper[4740]: E1009 10:30:09.768342 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:10.268325041 +0000 UTC m=+149.230525422 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.768648 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.776154 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.804661 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-95p7x" event={"ID":"494881f5-dbba-48c3-9871-c8d81136eda3","Type":"ContainerStarted","Data":"dab2136f2bbafb7b0f6a4665e1b3df7cc8c18f722a31099c3757a19297024087"} Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.804715 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-95p7x" event={"ID":"494881f5-dbba-48c3-9871-c8d81136eda3","Type":"ContainerStarted","Data":"26c509fea1d37c8fe1ed1cdec50f4d56fe30fa250752096ee662e09ce4f5b235"} Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.811154 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jckll" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.830918 4740 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.867694 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.868477 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.868656 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3267f79-181b-4b3e-b0c6-eba2901bf0cc-utilities\") pod \"community-operators-lrbjh\" (UID: \"b3267f79-181b-4b3e-b0c6-eba2901bf0cc\") " pod="openshift-marketplace/community-operators-lrbjh" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.868806 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3267f79-181b-4b3e-b0c6-eba2901bf0cc-catalog-content\") pod \"community-operators-lrbjh\" (UID: \"b3267f79-181b-4b3e-b0c6-eba2901bf0cc\") " pod="openshift-marketplace/community-operators-lrbjh" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.868831 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnlgh\" (UniqueName: \"kubernetes.io/projected/b3267f79-181b-4b3e-b0c6-eba2901bf0cc-kube-api-access-vnlgh\") pod \"community-operators-lrbjh\" (UID: \"b3267f79-181b-4b3e-b0c6-eba2901bf0cc\") " pod="openshift-marketplace/community-operators-lrbjh" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.869524 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3267f79-181b-4b3e-b0c6-eba2901bf0cc-utilities\") pod \"community-operators-lrbjh\" (UID: 
\"b3267f79-181b-4b3e-b0c6-eba2901bf0cc\") " pod="openshift-marketplace/community-operators-lrbjh" Oct 09 10:30:09 crc kubenswrapper[4740]: E1009 10:30:09.869634 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 10:30:10.369614719 +0000 UTC m=+149.331815100 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.875146 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3267f79-181b-4b3e-b0c6-eba2901bf0cc-catalog-content\") pod \"community-operators-lrbjh\" (UID: \"b3267f79-181b-4b3e-b0c6-eba2901bf0cc\") " pod="openshift-marketplace/community-operators-lrbjh" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.889447 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnlgh\" (UniqueName: \"kubernetes.io/projected/b3267f79-181b-4b3e-b0c6-eba2901bf0cc-kube-api-access-vnlgh\") pod \"community-operators-lrbjh\" (UID: \"b3267f79-181b-4b3e-b0c6-eba2901bf0cc\") " pod="openshift-marketplace/community-operators-lrbjh" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.901234 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-59flj"] Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.902354 4740 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-59flj" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.913474 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-59flj"] Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.946596 4740 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-09T10:30:09.83094939Z","Handler":null,"Name":""} Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.970034 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12c19a62-c9c0-4895-923b-4ac55e0f7c90-catalog-content\") pod \"certified-operators-59flj\" (UID: \"12c19a62-c9c0-4895-923b-4ac55e0f7c90\") " pod="openshift-marketplace/certified-operators-59flj" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.970159 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12c19a62-c9c0-4895-923b-4ac55e0f7c90-utilities\") pod \"certified-operators-59flj\" (UID: \"12c19a62-c9c0-4895-923b-4ac55e0f7c90\") " pod="openshift-marketplace/certified-operators-59flj" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.970220 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.970261 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-snkxt\" (UniqueName: \"kubernetes.io/projected/12c19a62-c9c0-4895-923b-4ac55e0f7c90-kube-api-access-snkxt\") pod \"certified-operators-59flj\" (UID: \"12c19a62-c9c0-4895-923b-4ac55e0f7c90\") " pod="openshift-marketplace/certified-operators-59flj" Oct 09 10:30:09 crc kubenswrapper[4740]: E1009 10:30:09.970509 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 10:30:10.470493656 +0000 UTC m=+149.432694037 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5pc6m" (UID: "513aa088-5f0d-479a-9668-e8ae80738297") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.974574 4740 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.974616 4740 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.987162 4740 patch_prober.go:28] interesting pod/router-default-5444994796-vntt9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 10:30:09 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Oct 09 10:30:09 crc 
kubenswrapper[4740]: [+]process-running ok Oct 09 10:30:09 crc kubenswrapper[4740]: healthz check failed Oct 09 10:30:09 crc kubenswrapper[4740]: I1009 10:30:09.987219 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vntt9" podUID="c9c38db8-21e3-495b-b6db-3ea52bec9b5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.046182 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrbjh" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.072001 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.072202 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12c19a62-c9c0-4895-923b-4ac55e0f7c90-utilities\") pod \"certified-operators-59flj\" (UID: \"12c19a62-c9c0-4895-923b-4ac55e0f7c90\") " pod="openshift-marketplace/certified-operators-59flj" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.072261 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snkxt\" (UniqueName: \"kubernetes.io/projected/12c19a62-c9c0-4895-923b-4ac55e0f7c90-kube-api-access-snkxt\") pod \"certified-operators-59flj\" (UID: \"12c19a62-c9c0-4895-923b-4ac55e0f7c90\") " pod="openshift-marketplace/certified-operators-59flj" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.072285 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/12c19a62-c9c0-4895-923b-4ac55e0f7c90-catalog-content\") pod \"certified-operators-59flj\" (UID: \"12c19a62-c9c0-4895-923b-4ac55e0f7c90\") " pod="openshift-marketplace/certified-operators-59flj" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.072673 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12c19a62-c9c0-4895-923b-4ac55e0f7c90-catalog-content\") pod \"certified-operators-59flj\" (UID: \"12c19a62-c9c0-4895-923b-4ac55e0f7c90\") " pod="openshift-marketplace/certified-operators-59flj" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.072916 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12c19a62-c9c0-4895-923b-4ac55e0f7c90-utilities\") pod \"certified-operators-59flj\" (UID: \"12c19a62-c9c0-4895-923b-4ac55e0f7c90\") " pod="openshift-marketplace/certified-operators-59flj" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.076582 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.099514 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c6t4k"] Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.100645 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c6t4k" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.104403 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snkxt\" (UniqueName: \"kubernetes.io/projected/12c19a62-c9c0-4895-923b-4ac55e0f7c90-kube-api-access-snkxt\") pod \"certified-operators-59flj\" (UID: \"12c19a62-c9c0-4895-923b-4ac55e0f7c90\") " pod="openshift-marketplace/certified-operators-59flj" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.118514 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c6t4k"] Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.121004 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jckll"] Oct 09 10:30:10 crc kubenswrapper[4740]: W1009 10:30:10.152288 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode86840e3_2c55_417d_9fa9_6eccaa01ad1a.slice/crio-191158acb51c6bb5ad54881d0da3f1126fd8a9ac6323a725fd243b11e945bb8e WatchSource:0}: Error finding container 191158acb51c6bb5ad54881d0da3f1126fd8a9ac6323a725fd243b11e945bb8e: Status 404 returned error can't find the container with id 191158acb51c6bb5ad54881d0da3f1126fd8a9ac6323a725fd243b11e945bb8e Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.174636 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t8gm\" (UniqueName: \"kubernetes.io/projected/ad2674eb-8f79-42a7-8d74-906279aaea2c-kube-api-access-5t8gm\") pod \"community-operators-c6t4k\" (UID: \"ad2674eb-8f79-42a7-8d74-906279aaea2c\") " pod="openshift-marketplace/community-operators-c6t4k" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.174717 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.174777 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad2674eb-8f79-42a7-8d74-906279aaea2c-utilities\") pod \"community-operators-c6t4k\" (UID: \"ad2674eb-8f79-42a7-8d74-906279aaea2c\") " pod="openshift-marketplace/community-operators-c6t4k" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.174801 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad2674eb-8f79-42a7-8d74-906279aaea2c-catalog-content\") pod \"community-operators-c6t4k\" (UID: \"ad2674eb-8f79-42a7-8d74-906279aaea2c\") " pod="openshift-marketplace/community-operators-c6t4k" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.182501 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.182542 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.221964 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5pc6m\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.224819 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-59flj" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.277165 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t8gm\" (UniqueName: \"kubernetes.io/projected/ad2674eb-8f79-42a7-8d74-906279aaea2c-kube-api-access-5t8gm\") pod \"community-operators-c6t4k\" (UID: \"ad2674eb-8f79-42a7-8d74-906279aaea2c\") " pod="openshift-marketplace/community-operators-c6t4k" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.277239 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad2674eb-8f79-42a7-8d74-906279aaea2c-utilities\") pod \"community-operators-c6t4k\" (UID: \"ad2674eb-8f79-42a7-8d74-906279aaea2c\") " pod="openshift-marketplace/community-operators-c6t4k" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.277266 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad2674eb-8f79-42a7-8d74-906279aaea2c-catalog-content\") pod \"community-operators-c6t4k\" (UID: \"ad2674eb-8f79-42a7-8d74-906279aaea2c\") " pod="openshift-marketplace/community-operators-c6t4k" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.277925 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad2674eb-8f79-42a7-8d74-906279aaea2c-utilities\") pod \"community-operators-c6t4k\" (UID: \"ad2674eb-8f79-42a7-8d74-906279aaea2c\") " pod="openshift-marketplace/community-operators-c6t4k" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.294972 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.297678 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad2674eb-8f79-42a7-8d74-906279aaea2c-catalog-content\") pod \"community-operators-c6t4k\" (UID: \"ad2674eb-8f79-42a7-8d74-906279aaea2c\") " pod="openshift-marketplace/community-operators-c6t4k" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.297974 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t8gm\" (UniqueName: \"kubernetes.io/projected/ad2674eb-8f79-42a7-8d74-906279aaea2c-kube-api-access-5t8gm\") pod \"community-operators-c6t4k\" (UID: \"ad2674eb-8f79-42a7-8d74-906279aaea2c\") " pod="openshift-marketplace/community-operators-c6t4k" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.315896 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lrbjh"] Oct 09 10:30:10 crc kubenswrapper[4740]: W1009 10:30:10.394339 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3267f79_181b_4b3e_b0c6_eba2901bf0cc.slice/crio-860c0f4878fdd4a94ddbb3b3cc062cb967163803b08ebba5f98f74eca2e47c9a WatchSource:0}: Error finding container 860c0f4878fdd4a94ddbb3b3cc062cb967163803b08ebba5f98f74eca2e47c9a: Status 404 returned error can't find the container with id 860c0f4878fdd4a94ddbb3b3cc062cb967163803b08ebba5f98f74eca2e47c9a Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.434456 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c6t4k" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.663958 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5pc6m"] Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.693652 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-59flj"] Oct 09 10:30:10 crc kubenswrapper[4740]: W1009 10:30:10.722350 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12c19a62_c9c0_4895_923b_4ac55e0f7c90.slice/crio-ddc80bfe353586c6590adbc9fafcd6f2a197a771e754550224b6c732a5067c9c WatchSource:0}: Error finding container ddc80bfe353586c6590adbc9fafcd6f2a197a771e754550224b6c732a5067c9c: Status 404 returned error can't find the container with id ddc80bfe353586c6590adbc9fafcd6f2a197a771e754550224b6c732a5067c9c Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.728228 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c6t4k"] Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.823616 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c6t4k" event={"ID":"ad2674eb-8f79-42a7-8d74-906279aaea2c","Type":"ContainerStarted","Data":"571bcde2c9dc4fb9608363fcb9b299dc73503fe8f962fca93113ad9955c56cc3"} Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.824828 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" event={"ID":"513aa088-5f0d-479a-9668-e8ae80738297","Type":"ContainerStarted","Data":"7e1e75bc2b3af83f830dff506cb187a6d2e3296d5fd08e32d3e882ae2179a978"} Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.825533 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrbjh" 
event={"ID":"b3267f79-181b-4b3e-b0c6-eba2901bf0cc","Type":"ContainerStarted","Data":"860c0f4878fdd4a94ddbb3b3cc062cb967163803b08ebba5f98f74eca2e47c9a"} Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.826242 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59flj" event={"ID":"12c19a62-c9c0-4895-923b-4ac55e0f7c90","Type":"ContainerStarted","Data":"ddc80bfe353586c6590adbc9fafcd6f2a197a771e754550224b6c732a5067c9c"} Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.831523 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d9b6fcac2d45abb7b060df4ddb48885642817ad97e711984c64582ccd4fbb081"} Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.833065 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-95p7x" event={"ID":"494881f5-dbba-48c3-9871-c8d81136eda3","Type":"ContainerStarted","Data":"d32d82d59733e4706a41395d667dc99434598efe2cb65ac8a10af6346b33c343"} Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.836504 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jckll" event={"ID":"e86840e3-2c55-417d-9fa9-6eccaa01ad1a","Type":"ContainerStarted","Data":"191158acb51c6bb5ad54881d0da3f1126fd8a9ac6323a725fd243b11e945bb8e"} Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.838383 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ef15fdeaa88b63ddf665466364bcd021e07f9ed9de37a121e40142852b2363dc"} Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.846467 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"974ed7dfb54c63fe9cd8a11818a863c0e15c2d3f3a1bcd37508df5a43bb4b7c8"} Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.855095 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-95p7x" podStartSLOduration=10.855073248 podStartE2EDuration="10.855073248s" podCreationTimestamp="2025-10-09 10:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:10.850442386 +0000 UTC m=+149.812642787" watchObservedRunningTime="2025-10-09 10:30:10.855073248 +0000 UTC m=+149.817273629" Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.982645 4740 patch_prober.go:28] interesting pod/router-default-5444994796-vntt9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 10:30:10 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Oct 09 10:30:10 crc kubenswrapper[4740]: [+]process-running ok Oct 09 10:30:10 crc kubenswrapper[4740]: healthz check failed Oct 09 10:30:10 crc kubenswrapper[4740]: I1009 10:30:10.982983 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vntt9" podUID="c9c38db8-21e3-495b-b6db-3ea52bec9b5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.693243 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l22jb"] Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.694159 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l22jb" Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.696131 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.703814 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l22jb"] Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.760976 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.837515 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf9rd\" (UniqueName: \"kubernetes.io/projected/13877acf-3046-4702-983c-5a3fc856477c-kube-api-access-jf9rd\") pod \"redhat-marketplace-l22jb\" (UID: \"13877acf-3046-4702-983c-5a3fc856477c\") " pod="openshift-marketplace/redhat-marketplace-l22jb" Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.837575 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13877acf-3046-4702-983c-5a3fc856477c-catalog-content\") pod \"redhat-marketplace-l22jb\" (UID: \"13877acf-3046-4702-983c-5a3fc856477c\") " pod="openshift-marketplace/redhat-marketplace-l22jb" Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.837652 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13877acf-3046-4702-983c-5a3fc856477c-utilities\") pod \"redhat-marketplace-l22jb\" (UID: \"13877acf-3046-4702-983c-5a3fc856477c\") " pod="openshift-marketplace/redhat-marketplace-l22jb" Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.851962 
4740 generic.go:334] "Generic (PLEG): container finished" podID="e4411b16-07f8-4701-ad4f-7645a00e829f" containerID="c870180cedd7934f886a0c284ef2c75616503f107aa0fac00c294108d4f2996a" exitCode=0 Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.852045 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf" event={"ID":"e4411b16-07f8-4701-ad4f-7645a00e829f","Type":"ContainerDied","Data":"c870180cedd7934f886a0c284ef2c75616503f107aa0fac00c294108d4f2996a"} Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.854085 4740 generic.go:334] "Generic (PLEG): container finished" podID="b3267f79-181b-4b3e-b0c6-eba2901bf0cc" containerID="a6b7ab4b98439f5d1fc0829847983ba8c38a56db73ec1b0b4a6e6f015f03b71f" exitCode=0 Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.854181 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrbjh" event={"ID":"b3267f79-181b-4b3e-b0c6-eba2901bf0cc","Type":"ContainerDied","Data":"a6b7ab4b98439f5d1fc0829847983ba8c38a56db73ec1b0b4a6e6f015f03b71f"} Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.855782 4740 generic.go:334] "Generic (PLEG): container finished" podID="12c19a62-c9c0-4895-923b-4ac55e0f7c90" containerID="05540e0826659e9bf1e7a3e53a0579345fb517d509f5594b55b7cbdfab42dab4" exitCode=0 Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.855850 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59flj" event={"ID":"12c19a62-c9c0-4895-923b-4ac55e0f7c90","Type":"ContainerDied","Data":"05540e0826659e9bf1e7a3e53a0579345fb517d509f5594b55b7cbdfab42dab4"} Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.856104 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.857358 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"384ec6c5a044ff0596cddd815b37fedf144fcf8e390fdfd7126223b083c8130d"} Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.859140 4740 generic.go:334] "Generic (PLEG): container finished" podID="ad2674eb-8f79-42a7-8d74-906279aaea2c" containerID="3dcf1f9127cec8fdb8f882fa4877d383e0101d6a1342536db9c46cb11a31903c" exitCode=0 Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.859208 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c6t4k" event={"ID":"ad2674eb-8f79-42a7-8d74-906279aaea2c","Type":"ContainerDied","Data":"3dcf1f9127cec8fdb8f882fa4877d383e0101d6a1342536db9c46cb11a31903c"} Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.861034 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" event={"ID":"513aa088-5f0d-479a-9668-e8ae80738297","Type":"ContainerStarted","Data":"c60f1e9c5763740979f7792e714c57889d82164f1b46ebcab0acddbe7077b964"} Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.862850 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.864461 4740 generic.go:334] "Generic (PLEG): container finished" podID="e86840e3-2c55-417d-9fa9-6eccaa01ad1a" containerID="08b90332ebc889f0f116ac85a82504bae3ff37eecf9ec9354cccc18212528473" exitCode=0 Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.864555 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jckll" event={"ID":"e86840e3-2c55-417d-9fa9-6eccaa01ad1a","Type":"ContainerDied","Data":"08b90332ebc889f0f116ac85a82504bae3ff37eecf9ec9354cccc18212528473"} Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.866044 4740 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e9413d186574bd764d8202fbe3c089d2e40141baddea3d0c4b294086c835d6dd"} Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.868351 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"15300d30713257306387cd2712a5d77b538e08e378dd3fe7dc63074a2499bc26"} Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.868398 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.945294 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13877acf-3046-4702-983c-5a3fc856477c-utilities\") pod \"redhat-marketplace-l22jb\" (UID: \"13877acf-3046-4702-983c-5a3fc856477c\") " pod="openshift-marketplace/redhat-marketplace-l22jb" Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.945529 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf9rd\" (UniqueName: \"kubernetes.io/projected/13877acf-3046-4702-983c-5a3fc856477c-kube-api-access-jf9rd\") pod \"redhat-marketplace-l22jb\" (UID: \"13877acf-3046-4702-983c-5a3fc856477c\") " pod="openshift-marketplace/redhat-marketplace-l22jb" Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.945605 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13877acf-3046-4702-983c-5a3fc856477c-catalog-content\") pod \"redhat-marketplace-l22jb\" (UID: \"13877acf-3046-4702-983c-5a3fc856477c\") " pod="openshift-marketplace/redhat-marketplace-l22jb" Oct 09 10:30:11 crc 
kubenswrapper[4740]: I1009 10:30:11.946709 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13877acf-3046-4702-983c-5a3fc856477c-catalog-content\") pod \"redhat-marketplace-l22jb\" (UID: \"13877acf-3046-4702-983c-5a3fc856477c\") " pod="openshift-marketplace/redhat-marketplace-l22jb" Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.947374 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13877acf-3046-4702-983c-5a3fc856477c-utilities\") pod \"redhat-marketplace-l22jb\" (UID: \"13877acf-3046-4702-983c-5a3fc856477c\") " pod="openshift-marketplace/redhat-marketplace-l22jb" Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.962865 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" podStartSLOduration=130.962846358 podStartE2EDuration="2m10.962846358s" podCreationTimestamp="2025-10-09 10:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:11.942543133 +0000 UTC m=+150.904743524" watchObservedRunningTime="2025-10-09 10:30:11.962846358 +0000 UTC m=+150.925046739" Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.972463 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf9rd\" (UniqueName: \"kubernetes.io/projected/13877acf-3046-4702-983c-5a3fc856477c-kube-api-access-jf9rd\") pod \"redhat-marketplace-l22jb\" (UID: \"13877acf-3046-4702-983c-5a3fc856477c\") " pod="openshift-marketplace/redhat-marketplace-l22jb" Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.996678 4740 patch_prober.go:28] interesting pod/router-default-5444994796-vntt9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Oct 09 10:30:11 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Oct 09 10:30:11 crc kubenswrapper[4740]: [+]process-running ok Oct 09 10:30:11 crc kubenswrapper[4740]: healthz check failed Oct 09 10:30:11 crc kubenswrapper[4740]: I1009 10:30:11.996743 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vntt9" podUID="c9c38db8-21e3-495b-b6db-3ea52bec9b5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.007255 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l22jb" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.037746 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.048697 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.048782 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.056871 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.057305 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.129230 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mpnqw"] Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.130727 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mpnqw" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.144011 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mpnqw"] Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.147777 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e1acaf9-1d30-4def-ab26-b58667065c61-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4e1acaf9-1d30-4def-ab26-b58667065c61\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.147822 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e1acaf9-1d30-4def-ab26-b58667065c61-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4e1acaf9-1d30-4def-ab26-b58667065c61\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.244326 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l22jb"] Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.248945 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e1acaf9-1d30-4def-ab26-b58667065c61-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4e1acaf9-1d30-4def-ab26-b58667065c61\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.249233 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txz6h\" (UniqueName: \"kubernetes.io/projected/f6917089-254a-407e-ab54-a8085317ff82-kube-api-access-txz6h\") pod \"redhat-marketplace-mpnqw\" (UID: 
\"f6917089-254a-407e-ab54-a8085317ff82\") " pod="openshift-marketplace/redhat-marketplace-mpnqw" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.249293 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6917089-254a-407e-ab54-a8085317ff82-catalog-content\") pod \"redhat-marketplace-mpnqw\" (UID: \"f6917089-254a-407e-ab54-a8085317ff82\") " pod="openshift-marketplace/redhat-marketplace-mpnqw" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.249316 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6917089-254a-407e-ab54-a8085317ff82-utilities\") pod \"redhat-marketplace-mpnqw\" (UID: \"f6917089-254a-407e-ab54-a8085317ff82\") " pod="openshift-marketplace/redhat-marketplace-mpnqw" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.249339 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e1acaf9-1d30-4def-ab26-b58667065c61-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4e1acaf9-1d30-4def-ab26-b58667065c61\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.249660 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e1acaf9-1d30-4def-ab26-b58667065c61-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4e1acaf9-1d30-4def-ab26-b58667065c61\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.270222 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e1acaf9-1d30-4def-ab26-b58667065c61-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"4e1acaf9-1d30-4def-ab26-b58667065c61\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.350940 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txz6h\" (UniqueName: \"kubernetes.io/projected/f6917089-254a-407e-ab54-a8085317ff82-kube-api-access-txz6h\") pod \"redhat-marketplace-mpnqw\" (UID: \"f6917089-254a-407e-ab54-a8085317ff82\") " pod="openshift-marketplace/redhat-marketplace-mpnqw" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.351047 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6917089-254a-407e-ab54-a8085317ff82-catalog-content\") pod \"redhat-marketplace-mpnqw\" (UID: \"f6917089-254a-407e-ab54-a8085317ff82\") " pod="openshift-marketplace/redhat-marketplace-mpnqw" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.351087 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6917089-254a-407e-ab54-a8085317ff82-utilities\") pod \"redhat-marketplace-mpnqw\" (UID: \"f6917089-254a-407e-ab54-a8085317ff82\") " pod="openshift-marketplace/redhat-marketplace-mpnqw" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.351433 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6917089-254a-407e-ab54-a8085317ff82-catalog-content\") pod \"redhat-marketplace-mpnqw\" (UID: \"f6917089-254a-407e-ab54-a8085317ff82\") " pod="openshift-marketplace/redhat-marketplace-mpnqw" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.351501 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6917089-254a-407e-ab54-a8085317ff82-utilities\") pod \"redhat-marketplace-mpnqw\" (UID: \"f6917089-254a-407e-ab54-a8085317ff82\") " 
pod="openshift-marketplace/redhat-marketplace-mpnqw" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.370695 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txz6h\" (UniqueName: \"kubernetes.io/projected/f6917089-254a-407e-ab54-a8085317ff82-kube-api-access-txz6h\") pod \"redhat-marketplace-mpnqw\" (UID: \"f6917089-254a-407e-ab54-a8085317ff82\") " pod="openshift-marketplace/redhat-marketplace-mpnqw" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.385577 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.458202 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mpnqw" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.554358 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.554410 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.563347 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.582530 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.672486 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mpnqw"] Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.702888 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m5xjh"] Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.705770 4740 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m5xjh" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.709683 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.714568 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m5xjh"] Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.781081 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-g68sq" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.781128 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-g68sq" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.782453 4740 patch_prober.go:28] interesting pod/console-f9d7485db-g68sq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.782500 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-g68sq" podUID="6fab96fb-79cd-4d15-a23a-20d1bd2d5c39" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.803856 4740 patch_prober.go:28] interesting pod/downloads-7954f5f757-qzp8b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.804195 4740 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-qzp8b" podUID="f2de169d-9583-46e5-b2ee-da1a6903eafb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.803856 4740 patch_prober.go:28] interesting pod/downloads-7954f5f757-qzp8b container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.804345 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qzp8b" podUID="f2de169d-9583-46e5-b2ee-da1a6903eafb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.860550 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd77s\" (UniqueName: \"kubernetes.io/projected/c009e3bf-3859-4e55-95c0-dc8049291674-kube-api-access-wd77s\") pod \"redhat-operators-m5xjh\" (UID: \"c009e3bf-3859-4e55-95c0-dc8049291674\") " pod="openshift-marketplace/redhat-operators-m5xjh" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.860598 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c009e3bf-3859-4e55-95c0-dc8049291674-catalog-content\") pod \"redhat-operators-m5xjh\" (UID: \"c009e3bf-3859-4e55-95c0-dc8049291674\") " pod="openshift-marketplace/redhat-operators-m5xjh" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.860656 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c009e3bf-3859-4e55-95c0-dc8049291674-utilities\") pod \"redhat-operators-m5xjh\" (UID: \"c009e3bf-3859-4e55-95c0-dc8049291674\") " pod="openshift-marketplace/redhat-operators-m5xjh" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.891002 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mpnqw" event={"ID":"f6917089-254a-407e-ab54-a8085317ff82","Type":"ContainerStarted","Data":"4aa133ec7523efd438148a75d74465e73748d47e48146e429ae304fa194fa603"} Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.894597 4740 generic.go:334] "Generic (PLEG): container finished" podID="13877acf-3046-4702-983c-5a3fc856477c" containerID="c47786ccd6b0f5ea879e44dc759021539a838f442b925c30ed3b5f01873d93da" exitCode=0 Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.894826 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l22jb" event={"ID":"13877acf-3046-4702-983c-5a3fc856477c","Type":"ContainerDied","Data":"c47786ccd6b0f5ea879e44dc759021539a838f442b925c30ed3b5f01873d93da"} Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.894874 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l22jb" event={"ID":"13877acf-3046-4702-983c-5a3fc856477c","Type":"ContainerStarted","Data":"ce1d0bcff60d952bea319fa9a9efb5f279f758a83b4ef34c7ce9f0e1854a518c"} Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.897102 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4e1acaf9-1d30-4def-ab26-b58667065c61","Type":"ContainerStarted","Data":"04676d5e7197ec5a31a02d01251660d75bc13d8cc1cf58c271f1b95be54d766e"} Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.904493 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-lxzfg" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 
10:30:12.962832 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd77s\" (UniqueName: \"kubernetes.io/projected/c009e3bf-3859-4e55-95c0-dc8049291674-kube-api-access-wd77s\") pod \"redhat-operators-m5xjh\" (UID: \"c009e3bf-3859-4e55-95c0-dc8049291674\") " pod="openshift-marketplace/redhat-operators-m5xjh" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.962880 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c009e3bf-3859-4e55-95c0-dc8049291674-catalog-content\") pod \"redhat-operators-m5xjh\" (UID: \"c009e3bf-3859-4e55-95c0-dc8049291674\") " pod="openshift-marketplace/redhat-operators-m5xjh" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.962919 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c009e3bf-3859-4e55-95c0-dc8049291674-utilities\") pod \"redhat-operators-m5xjh\" (UID: \"c009e3bf-3859-4e55-95c0-dc8049291674\") " pod="openshift-marketplace/redhat-operators-m5xjh" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.963871 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c009e3bf-3859-4e55-95c0-dc8049291674-catalog-content\") pod \"redhat-operators-m5xjh\" (UID: \"c009e3bf-3859-4e55-95c0-dc8049291674\") " pod="openshift-marketplace/redhat-operators-m5xjh" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.964183 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c009e3bf-3859-4e55-95c0-dc8049291674-utilities\") pod \"redhat-operators-m5xjh\" (UID: \"c009e3bf-3859-4e55-95c0-dc8049291674\") " pod="openshift-marketplace/redhat-operators-m5xjh" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.986068 4740 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-vntt9" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.995131 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd77s\" (UniqueName: \"kubernetes.io/projected/c009e3bf-3859-4e55-95c0-dc8049291674-kube-api-access-wd77s\") pod \"redhat-operators-m5xjh\" (UID: \"c009e3bf-3859-4e55-95c0-dc8049291674\") " pod="openshift-marketplace/redhat-operators-m5xjh" Oct 09 10:30:12 crc kubenswrapper[4740]: I1009 10:30:12.999949 4740 patch_prober.go:28] interesting pod/router-default-5444994796-vntt9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 10:30:12 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Oct 09 10:30:12 crc kubenswrapper[4740]: [+]process-running ok Oct 09 10:30:12 crc kubenswrapper[4740]: healthz check failed Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.000079 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vntt9" podUID="c9c38db8-21e3-495b-b6db-3ea52bec9b5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.032209 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m5xjh" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.103838 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xxdsj"] Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.106665 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xxdsj" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.111989 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xxdsj"] Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.266873 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b-catalog-content\") pod \"redhat-operators-xxdsj\" (UID: \"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b\") " pod="openshift-marketplace/redhat-operators-xxdsj" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.267171 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm8rm\" (UniqueName: \"kubernetes.io/projected/dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b-kube-api-access-wm8rm\") pod \"redhat-operators-xxdsj\" (UID: \"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b\") " pod="openshift-marketplace/redhat-operators-xxdsj" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.267207 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b-utilities\") pod \"redhat-operators-xxdsj\" (UID: \"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b\") " pod="openshift-marketplace/redhat-operators-xxdsj" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.305117 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.368771 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b-catalog-content\") pod \"redhat-operators-xxdsj\" (UID: 
\"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b\") " pod="openshift-marketplace/redhat-operators-xxdsj" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.368819 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm8rm\" (UniqueName: \"kubernetes.io/projected/dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b-kube-api-access-wm8rm\") pod \"redhat-operators-xxdsj\" (UID: \"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b\") " pod="openshift-marketplace/redhat-operators-xxdsj" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.368865 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b-utilities\") pod \"redhat-operators-xxdsj\" (UID: \"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b\") " pod="openshift-marketplace/redhat-operators-xxdsj" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.372322 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b-utilities\") pod \"redhat-operators-xxdsj\" (UID: \"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b\") " pod="openshift-marketplace/redhat-operators-xxdsj" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.372478 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b-catalog-content\") pod \"redhat-operators-xxdsj\" (UID: \"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b\") " pod="openshift-marketplace/redhat-operators-xxdsj" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.393007 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm8rm\" (UniqueName: \"kubernetes.io/projected/dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b-kube-api-access-wm8rm\") pod \"redhat-operators-xxdsj\" (UID: \"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b\") " 
pod="openshift-marketplace/redhat-operators-xxdsj" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.431048 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xxdsj" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.534215 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.631870 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m5xjh"] Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.671385 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4411b16-07f8-4701-ad4f-7645a00e829f-secret-volume\") pod \"e4411b16-07f8-4701-ad4f-7645a00e829f\" (UID: \"e4411b16-07f8-4701-ad4f-7645a00e829f\") " Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.671492 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8ww7\" (UniqueName: \"kubernetes.io/projected/e4411b16-07f8-4701-ad4f-7645a00e829f-kube-api-access-j8ww7\") pod \"e4411b16-07f8-4701-ad4f-7645a00e829f\" (UID: \"e4411b16-07f8-4701-ad4f-7645a00e829f\") " Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.671543 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4411b16-07f8-4701-ad4f-7645a00e829f-config-volume\") pod \"e4411b16-07f8-4701-ad4f-7645a00e829f\" (UID: \"e4411b16-07f8-4701-ad4f-7645a00e829f\") " Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.672987 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4411b16-07f8-4701-ad4f-7645a00e829f-config-volume" (OuterVolumeSpecName: "config-volume") pod "e4411b16-07f8-4701-ad4f-7645a00e829f" 
(UID: "e4411b16-07f8-4701-ad4f-7645a00e829f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.675375 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4411b16-07f8-4701-ad4f-7645a00e829f-kube-api-access-j8ww7" (OuterVolumeSpecName: "kube-api-access-j8ww7") pod "e4411b16-07f8-4701-ad4f-7645a00e829f" (UID: "e4411b16-07f8-4701-ad4f-7645a00e829f"). InnerVolumeSpecName "kube-api-access-j8ww7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.677702 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4411b16-07f8-4701-ad4f-7645a00e829f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e4411b16-07f8-4701-ad4f-7645a00e829f" (UID: "e4411b16-07f8-4701-ad4f-7645a00e829f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.773782 4740 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4411b16-07f8-4701-ad4f-7645a00e829f-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.773889 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8ww7\" (UniqueName: \"kubernetes.io/projected/e4411b16-07f8-4701-ad4f-7645a00e829f-kube-api-access-j8ww7\") on node \"crc\" DevicePath \"\"" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.773900 4740 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4411b16-07f8-4701-ad4f-7645a00e829f-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.783433 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-xxdsj"] Oct 09 10:30:13 crc kubenswrapper[4740]: W1009 10:30:13.795827 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbfab557_0ba2_49b1_88ed_d9c9a23a0e6b.slice/crio-d32c81a27e6ed6646ad8abcc0fe1ab8f027c7146caa173c08c198f5e0e9b5f2a WatchSource:0}: Error finding container d32c81a27e6ed6646ad8abcc0fe1ab8f027c7146caa173c08c198f5e0e9b5f2a: Status 404 returned error can't find the container with id d32c81a27e6ed6646ad8abcc0fe1ab8f027c7146caa173c08c198f5e0e9b5f2a Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.915277 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf" event={"ID":"e4411b16-07f8-4701-ad4f-7645a00e829f","Type":"ContainerDied","Data":"cfb65d4d5bfc3ea9d9ce3e7401b180911e73dbfd09d9a7def757777d6d4b196f"} Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.915323 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfb65d4d5bfc3ea9d9ce3e7401b180911e73dbfd09d9a7def757777d6d4b196f" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.915319 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf" Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.918698 4740 generic.go:334] "Generic (PLEG): container finished" podID="4e1acaf9-1d30-4def-ab26-b58667065c61" containerID="1fcf5610730ca8f190cad21a5a58419b2ae26ff7a0cd55c0ffb7242e7b0c5071" exitCode=0 Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.918776 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4e1acaf9-1d30-4def-ab26-b58667065c61","Type":"ContainerDied","Data":"1fcf5610730ca8f190cad21a5a58419b2ae26ff7a0cd55c0ffb7242e7b0c5071"} Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.925854 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5xjh" event={"ID":"c009e3bf-3859-4e55-95c0-dc8049291674","Type":"ContainerStarted","Data":"b7baf7925be29982ae7dd84757590a33ddfdae7044c79a3386ab2c7e627978f1"} Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.925901 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5xjh" event={"ID":"c009e3bf-3859-4e55-95c0-dc8049291674","Type":"ContainerStarted","Data":"e0d2ffdc3ca622ead0d97772a82452486e20812efedeada83868edaa2f086572"} Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.927740 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxdsj" event={"ID":"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b","Type":"ContainerStarted","Data":"d32c81a27e6ed6646ad8abcc0fe1ab8f027c7146caa173c08c198f5e0e9b5f2a"} Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.929653 4740 generic.go:334] "Generic (PLEG): container finished" podID="f6917089-254a-407e-ab54-a8085317ff82" containerID="eca573bf16b4a0022494769158fd04296a17ac1a2ffbfee9d4b9a85c67832e68" exitCode=0 Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.929725 4740 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mpnqw" event={"ID":"f6917089-254a-407e-ab54-a8085317ff82","Type":"ContainerDied","Data":"eca573bf16b4a0022494769158fd04296a17ac1a2ffbfee9d4b9a85c67832e68"} Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.997636 4740 patch_prober.go:28] interesting pod/router-default-5444994796-vntt9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 10:30:13 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Oct 09 10:30:13 crc kubenswrapper[4740]: [+]process-running ok Oct 09 10:30:13 crc kubenswrapper[4740]: healthz check failed Oct 09 10:30:13 crc kubenswrapper[4740]: I1009 10:30:13.998055 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vntt9" podUID="c9c38db8-21e3-495b-b6db-3ea52bec9b5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 10:30:14 crc kubenswrapper[4740]: I1009 10:30:14.386969 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 09 10:30:14 crc kubenswrapper[4740]: E1009 10:30:14.387244 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4411b16-07f8-4701-ad4f-7645a00e829f" containerName="collect-profiles" Oct 09 10:30:14 crc kubenswrapper[4740]: I1009 10:30:14.387259 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4411b16-07f8-4701-ad4f-7645a00e829f" containerName="collect-profiles" Oct 09 10:30:14 crc kubenswrapper[4740]: I1009 10:30:14.387412 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4411b16-07f8-4701-ad4f-7645a00e829f" containerName="collect-profiles" Oct 09 10:30:14 crc kubenswrapper[4740]: I1009 10:30:14.388074 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 10:30:14 crc kubenswrapper[4740]: I1009 10:30:14.392420 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 09 10:30:14 crc kubenswrapper[4740]: I1009 10:30:14.393536 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 09 10:30:14 crc kubenswrapper[4740]: I1009 10:30:14.394976 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 09 10:30:14 crc kubenswrapper[4740]: I1009 10:30:14.486176 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30103fa4-b346-46dd-b1ac-478a2680acc9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"30103fa4-b346-46dd-b1ac-478a2680acc9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 10:30:14 crc kubenswrapper[4740]: I1009 10:30:14.486235 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/30103fa4-b346-46dd-b1ac-478a2680acc9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"30103fa4-b346-46dd-b1ac-478a2680acc9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 10:30:14 crc kubenswrapper[4740]: I1009 10:30:14.588067 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30103fa4-b346-46dd-b1ac-478a2680acc9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"30103fa4-b346-46dd-b1ac-478a2680acc9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 10:30:14 crc kubenswrapper[4740]: I1009 10:30:14.588116 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/30103fa4-b346-46dd-b1ac-478a2680acc9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"30103fa4-b346-46dd-b1ac-478a2680acc9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 10:30:14 crc kubenswrapper[4740]: I1009 10:30:14.588194 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/30103fa4-b346-46dd-b1ac-478a2680acc9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"30103fa4-b346-46dd-b1ac-478a2680acc9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 10:30:14 crc kubenswrapper[4740]: I1009 10:30:14.613408 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30103fa4-b346-46dd-b1ac-478a2680acc9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"30103fa4-b346-46dd-b1ac-478a2680acc9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 10:30:14 crc kubenswrapper[4740]: I1009 10:30:14.790010 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 10:30:14 crc kubenswrapper[4740]: I1009 10:30:14.959828 4740 generic.go:334] "Generic (PLEG): container finished" podID="dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b" containerID="f01b824ad97841dbeab6440fb4ec0f58f9bf2d374c13d4f05aec0144e04418ab" exitCode=0 Oct 09 10:30:14 crc kubenswrapper[4740]: I1009 10:30:14.959980 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxdsj" event={"ID":"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b","Type":"ContainerDied","Data":"f01b824ad97841dbeab6440fb4ec0f58f9bf2d374c13d4f05aec0144e04418ab"} Oct 09 10:30:14 crc kubenswrapper[4740]: I1009 10:30:14.964829 4740 generic.go:334] "Generic (PLEG): container finished" podID="c009e3bf-3859-4e55-95c0-dc8049291674" containerID="b7baf7925be29982ae7dd84757590a33ddfdae7044c79a3386ab2c7e627978f1" exitCode=0 Oct 09 10:30:14 crc kubenswrapper[4740]: I1009 10:30:14.964995 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5xjh" event={"ID":"c009e3bf-3859-4e55-95c0-dc8049291674","Type":"ContainerDied","Data":"b7baf7925be29982ae7dd84757590a33ddfdae7044c79a3386ab2c7e627978f1"} Oct 09 10:30:14 crc kubenswrapper[4740]: I1009 10:30:14.984095 4740 patch_prober.go:28] interesting pod/router-default-5444994796-vntt9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 10:30:14 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Oct 09 10:30:14 crc kubenswrapper[4740]: [+]process-running ok Oct 09 10:30:14 crc kubenswrapper[4740]: healthz check failed Oct 09 10:30:14 crc kubenswrapper[4740]: I1009 10:30:14.984155 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vntt9" podUID="c9c38db8-21e3-495b-b6db-3ea52bec9b5c" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 10:30:15 crc kubenswrapper[4740]: I1009 10:30:15.075508 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 09 10:30:15 crc kubenswrapper[4740]: W1009 10:30:15.132556 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod30103fa4_b346_46dd_b1ac_478a2680acc9.slice/crio-ee1f7ef00260102d95dafaff794d75ea2ade948d1e86f3fc54638b6ca13b35a7 WatchSource:0}: Error finding container ee1f7ef00260102d95dafaff794d75ea2ade948d1e86f3fc54638b6ca13b35a7: Status 404 returned error can't find the container with id ee1f7ef00260102d95dafaff794d75ea2ade948d1e86f3fc54638b6ca13b35a7 Oct 09 10:30:15 crc kubenswrapper[4740]: I1009 10:30:15.332354 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 10:30:15 crc kubenswrapper[4740]: I1009 10:30:15.416169 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e1acaf9-1d30-4def-ab26-b58667065c61-kubelet-dir\") pod \"4e1acaf9-1d30-4def-ab26-b58667065c61\" (UID: \"4e1acaf9-1d30-4def-ab26-b58667065c61\") " Oct 09 10:30:15 crc kubenswrapper[4740]: I1009 10:30:15.416282 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e1acaf9-1d30-4def-ab26-b58667065c61-kube-api-access\") pod \"4e1acaf9-1d30-4def-ab26-b58667065c61\" (UID: \"4e1acaf9-1d30-4def-ab26-b58667065c61\") " Oct 09 10:30:15 crc kubenswrapper[4740]: I1009 10:30:15.417784 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e1acaf9-1d30-4def-ab26-b58667065c61-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4e1acaf9-1d30-4def-ab26-b58667065c61" (UID: "4e1acaf9-1d30-4def-ab26-b58667065c61"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:30:15 crc kubenswrapper[4740]: I1009 10:30:15.421985 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e1acaf9-1d30-4def-ab26-b58667065c61-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4e1acaf9-1d30-4def-ab26-b58667065c61" (UID: "4e1acaf9-1d30-4def-ab26-b58667065c61"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:30:15 crc kubenswrapper[4740]: I1009 10:30:15.517838 4740 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e1acaf9-1d30-4def-ab26-b58667065c61-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 09 10:30:15 crc kubenswrapper[4740]: I1009 10:30:15.517869 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e1acaf9-1d30-4def-ab26-b58667065c61-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 10:30:15 crc kubenswrapper[4740]: I1009 10:30:15.982808 4740 patch_prober.go:28] interesting pod/router-default-5444994796-vntt9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 10:30:15 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Oct 09 10:30:15 crc kubenswrapper[4740]: [+]process-running ok Oct 09 10:30:15 crc kubenswrapper[4740]: healthz check failed Oct 09 10:30:15 crc kubenswrapper[4740]: I1009 10:30:15.982883 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vntt9" podUID="c9c38db8-21e3-495b-b6db-3ea52bec9b5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 10:30:15 crc kubenswrapper[4740]: I1009 10:30:15.993796 4740 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"30103fa4-b346-46dd-b1ac-478a2680acc9","Type":"ContainerStarted","Data":"b03c323ff5340ef99fec396e25e96156b6353ff4a71f0cb50db351c2766d86f7"} Oct 09 10:30:15 crc kubenswrapper[4740]: I1009 10:30:15.993943 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"30103fa4-b346-46dd-b1ac-478a2680acc9","Type":"ContainerStarted","Data":"ee1f7ef00260102d95dafaff794d75ea2ade948d1e86f3fc54638b6ca13b35a7"} Oct 09 10:30:15 crc kubenswrapper[4740]: I1009 10:30:15.997678 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4e1acaf9-1d30-4def-ab26-b58667065c61","Type":"ContainerDied","Data":"04676d5e7197ec5a31a02d01251660d75bc13d8cc1cf58c271f1b95be54d766e"} Oct 09 10:30:15 crc kubenswrapper[4740]: I1009 10:30:15.997724 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04676d5e7197ec5a31a02d01251660d75bc13d8cc1cf58c271f1b95be54d766e" Oct 09 10:30:15 crc kubenswrapper[4740]: I1009 10:30:15.997737 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 10:30:16 crc kubenswrapper[4740]: I1009 10:30:16.981792 4740 patch_prober.go:28] interesting pod/router-default-5444994796-vntt9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 10:30:16 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Oct 09 10:30:16 crc kubenswrapper[4740]: [+]process-running ok Oct 09 10:30:16 crc kubenswrapper[4740]: healthz check failed Oct 09 10:30:16 crc kubenswrapper[4740]: I1009 10:30:16.981844 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vntt9" podUID="c9c38db8-21e3-495b-b6db-3ea52bec9b5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 10:30:17 crc kubenswrapper[4740]: I1009 10:30:17.018395 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.018372719 podStartE2EDuration="3.018372719s" podCreationTimestamp="2025-10-09 10:30:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:17.018117233 +0000 UTC m=+155.980317614" watchObservedRunningTime="2025-10-09 10:30:17.018372719 +0000 UTC m=+155.980573100" Oct 09 10:30:17 crc kubenswrapper[4740]: I1009 10:30:17.981447 4740 patch_prober.go:28] interesting pod/router-default-5444994796-vntt9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 10:30:17 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Oct 09 10:30:17 crc kubenswrapper[4740]: [+]process-running ok Oct 09 10:30:17 crc kubenswrapper[4740]: healthz 
check failed Oct 09 10:30:17 crc kubenswrapper[4740]: I1009 10:30:17.981502 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vntt9" podUID="c9c38db8-21e3-495b-b6db-3ea52bec9b5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 10:30:18 crc kubenswrapper[4740]: I1009 10:30:18.017961 4740 generic.go:334] "Generic (PLEG): container finished" podID="30103fa4-b346-46dd-b1ac-478a2680acc9" containerID="b03c323ff5340ef99fec396e25e96156b6353ff4a71f0cb50db351c2766d86f7" exitCode=0 Oct 09 10:30:18 crc kubenswrapper[4740]: I1009 10:30:18.018008 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"30103fa4-b346-46dd-b1ac-478a2680acc9","Type":"ContainerDied","Data":"b03c323ff5340ef99fec396e25e96156b6353ff4a71f0cb50db351c2766d86f7"} Oct 09 10:30:18 crc kubenswrapper[4740]: I1009 10:30:18.407318 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xhb6x" Oct 09 10:30:18 crc kubenswrapper[4740]: I1009 10:30:18.983361 4740 patch_prober.go:28] interesting pod/router-default-5444994796-vntt9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 10:30:18 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Oct 09 10:30:18 crc kubenswrapper[4740]: [+]process-running ok Oct 09 10:30:18 crc kubenswrapper[4740]: healthz check failed Oct 09 10:30:18 crc kubenswrapper[4740]: I1009 10:30:18.983670 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vntt9" podUID="c9c38db8-21e3-495b-b6db-3ea52bec9b5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 10:30:19 crc kubenswrapper[4740]: I1009 10:30:19.983293 4740 patch_prober.go:28] 
interesting pod/router-default-5444994796-vntt9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 10:30:19 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Oct 09 10:30:19 crc kubenswrapper[4740]: [+]process-running ok Oct 09 10:30:19 crc kubenswrapper[4740]: healthz check failed Oct 09 10:30:19 crc kubenswrapper[4740]: I1009 10:30:19.983631 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vntt9" podUID="c9c38db8-21e3-495b-b6db-3ea52bec9b5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 10:30:20 crc kubenswrapper[4740]: I1009 10:30:20.981930 4740 patch_prober.go:28] interesting pod/router-default-5444994796-vntt9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 10:30:20 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Oct 09 10:30:20 crc kubenswrapper[4740]: [+]process-running ok Oct 09 10:30:20 crc kubenswrapper[4740]: healthz check failed Oct 09 10:30:20 crc kubenswrapper[4740]: I1009 10:30:20.981992 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vntt9" podUID="c9c38db8-21e3-495b-b6db-3ea52bec9b5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 10:30:21 crc kubenswrapper[4740]: I1009 10:30:21.983696 4740 patch_prober.go:28] interesting pod/router-default-5444994796-vntt9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 10:30:21 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Oct 09 10:30:21 crc 
kubenswrapper[4740]: [+]process-running ok Oct 09 10:30:21 crc kubenswrapper[4740]: healthz check failed Oct 09 10:30:21 crc kubenswrapper[4740]: I1009 10:30:21.984067 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vntt9" podUID="c9c38db8-21e3-495b-b6db-3ea52bec9b5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 10:30:22 crc kubenswrapper[4740]: I1009 10:30:22.557941 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:30:22 crc kubenswrapper[4740]: I1009 10:30:22.781124 4740 patch_prober.go:28] interesting pod/console-f9d7485db-g68sq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 09 10:30:22 crc kubenswrapper[4740]: I1009 10:30:22.781836 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-g68sq" podUID="6fab96fb-79cd-4d15-a23a-20d1bd2d5c39" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 09 10:30:22 crc kubenswrapper[4740]: I1009 10:30:22.804893 4740 patch_prober.go:28] interesting pod/downloads-7954f5f757-qzp8b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Oct 09 10:30:22 crc kubenswrapper[4740]: I1009 10:30:22.804948 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qzp8b" podUID="f2de169d-9583-46e5-b2ee-da1a6903eafb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Oct 09 
10:30:22 crc kubenswrapper[4740]: I1009 10:30:22.805293 4740 patch_prober.go:28] interesting pod/downloads-7954f5f757-qzp8b container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Oct 09 10:30:22 crc kubenswrapper[4740]: I1009 10:30:22.805318 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qzp8b" podUID="f2de169d-9583-46e5-b2ee-da1a6903eafb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Oct 09 10:30:22 crc kubenswrapper[4740]: I1009 10:30:22.981957 4740 patch_prober.go:28] interesting pod/router-default-5444994796-vntt9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 10:30:22 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Oct 09 10:30:22 crc kubenswrapper[4740]: [+]process-running ok Oct 09 10:30:22 crc kubenswrapper[4740]: healthz check failed Oct 09 10:30:22 crc kubenswrapper[4740]: I1009 10:30:22.982008 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vntt9" podUID="c9c38db8-21e3-495b-b6db-3ea52bec9b5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 10:30:23 crc kubenswrapper[4740]: I1009 10:30:23.931526 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs\") pod \"network-metrics-daemon-z74b9\" (UID: \"01aecf36-9a78-414c-8078-5c114c1dfa3f\") " pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:30:23 crc kubenswrapper[4740]: I1009 10:30:23.945665 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01aecf36-9a78-414c-8078-5c114c1dfa3f-metrics-certs\") pod \"network-metrics-daemon-z74b9\" (UID: \"01aecf36-9a78-414c-8078-5c114c1dfa3f\") " pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:30:23 crc kubenswrapper[4740]: I1009 10:30:23.986033 4740 patch_prober.go:28] interesting pod/router-default-5444994796-vntt9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 10:30:23 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Oct 09 10:30:23 crc kubenswrapper[4740]: [+]process-running ok Oct 09 10:30:23 crc kubenswrapper[4740]: healthz check failed Oct 09 10:30:23 crc kubenswrapper[4740]: I1009 10:30:23.986097 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vntt9" podUID="c9c38db8-21e3-495b-b6db-3ea52bec9b5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 10:30:24 crc kubenswrapper[4740]: I1009 10:30:24.183773 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z74b9" Oct 09 10:30:24 crc kubenswrapper[4740]: I1009 10:30:24.811129 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 10:30:24 crc kubenswrapper[4740]: I1009 10:30:24.946349 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/30103fa4-b346-46dd-b1ac-478a2680acc9-kubelet-dir\") pod \"30103fa4-b346-46dd-b1ac-478a2680acc9\" (UID: \"30103fa4-b346-46dd-b1ac-478a2680acc9\") " Oct 09 10:30:24 crc kubenswrapper[4740]: I1009 10:30:24.946447 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30103fa4-b346-46dd-b1ac-478a2680acc9-kube-api-access\") pod \"30103fa4-b346-46dd-b1ac-478a2680acc9\" (UID: \"30103fa4-b346-46dd-b1ac-478a2680acc9\") " Oct 09 10:30:24 crc kubenswrapper[4740]: I1009 10:30:24.946488 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30103fa4-b346-46dd-b1ac-478a2680acc9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "30103fa4-b346-46dd-b1ac-478a2680acc9" (UID: "30103fa4-b346-46dd-b1ac-478a2680acc9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:30:24 crc kubenswrapper[4740]: I1009 10:30:24.946696 4740 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/30103fa4-b346-46dd-b1ac-478a2680acc9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 09 10:30:24 crc kubenswrapper[4740]: I1009 10:30:24.953781 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30103fa4-b346-46dd-b1ac-478a2680acc9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "30103fa4-b346-46dd-b1ac-478a2680acc9" (UID: "30103fa4-b346-46dd-b1ac-478a2680acc9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:30:24 crc kubenswrapper[4740]: I1009 10:30:24.983182 4740 patch_prober.go:28] interesting pod/router-default-5444994796-vntt9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 10:30:24 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Oct 09 10:30:24 crc kubenswrapper[4740]: [+]process-running ok Oct 09 10:30:24 crc kubenswrapper[4740]: healthz check failed Oct 09 10:30:24 crc kubenswrapper[4740]: I1009 10:30:24.983246 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vntt9" podUID="c9c38db8-21e3-495b-b6db-3ea52bec9b5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 10:30:25 crc kubenswrapper[4740]: I1009 10:30:25.048214 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30103fa4-b346-46dd-b1ac-478a2680acc9-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 10:30:25 crc kubenswrapper[4740]: I1009 10:30:25.057714 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"30103fa4-b346-46dd-b1ac-478a2680acc9","Type":"ContainerDied","Data":"ee1f7ef00260102d95dafaff794d75ea2ade948d1e86f3fc54638b6ca13b35a7"} Oct 09 10:30:25 crc kubenswrapper[4740]: I1009 10:30:25.057812 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee1f7ef00260102d95dafaff794d75ea2ade948d1e86f3fc54638b6ca13b35a7" Oct 09 10:30:25 crc kubenswrapper[4740]: I1009 10:30:25.057819 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 10:30:25 crc kubenswrapper[4740]: I1009 10:30:25.981495 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-vntt9" Oct 09 10:30:25 crc kubenswrapper[4740]: I1009 10:30:25.983802 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-vntt9" Oct 09 10:30:30 crc kubenswrapper[4740]: I1009 10:30:30.299737 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:30:32 crc kubenswrapper[4740]: I1009 10:30:32.787575 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-g68sq" Oct 09 10:30:32 crc kubenswrapper[4740]: I1009 10:30:32.792516 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-g68sq" Oct 09 10:30:32 crc kubenswrapper[4740]: I1009 10:30:32.834804 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-qzp8b" Oct 09 10:30:35 crc kubenswrapper[4740]: I1009 10:30:35.407677 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 10:30:35 crc kubenswrapper[4740]: I1009 10:30:35.408935 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 10:30:43 crc 
kubenswrapper[4740]: I1009 10:30:43.064786 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w956x" Oct 09 10:30:43 crc kubenswrapper[4740]: E1009 10:30:43.574589 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 09 10:30:43 crc kubenswrapper[4740]: E1009 10:30:43.574796 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jf9rd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFro
mSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-l22jb_openshift-marketplace(13877acf-3046-4702-983c-5a3fc856477c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 10:30:43 crc kubenswrapper[4740]: E1009 10:30:43.576022 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-l22jb" podUID="13877acf-3046-4702-983c-5a3fc856477c" Oct 09 10:30:46 crc kubenswrapper[4740]: E1009 10:30:46.602868 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 09 10:30:46 crc kubenswrapper[4740]: E1009 10:30:46.603012 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-txz6h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mpnqw_openshift-marketplace(f6917089-254a-407e-ab54-a8085317ff82): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 10:30:46 crc kubenswrapper[4740]: E1009 10:30:46.604265 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mpnqw" podUID="f6917089-254a-407e-ab54-a8085317ff82" Oct 09 10:30:47 crc 
kubenswrapper[4740]: E1009 10:30:47.471730 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-l22jb" podUID="13877acf-3046-4702-983c-5a3fc856477c" Oct 09 10:30:47 crc kubenswrapper[4740]: E1009 10:30:47.472474 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-mpnqw" podUID="f6917089-254a-407e-ab54-a8085317ff82" Oct 09 10:30:48 crc kubenswrapper[4740]: E1009 10:30:48.154840 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 09 10:30:48 crc kubenswrapper[4740]: E1009 10:30:48.155023 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-snkxt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-59flj_openshift-marketplace(12c19a62-c9c0-4895-923b-4ac55e0f7c90): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 10:30:48 crc kubenswrapper[4740]: E1009 10:30:48.156691 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-59flj" podUID="12c19a62-c9c0-4895-923b-4ac55e0f7c90" Oct 09 10:30:48 crc 
kubenswrapper[4740]: E1009 10:30:48.288051 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 09 10:30:48 crc kubenswrapper[4740]: E1009 10:30:48.288229 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vzgkt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-jckll_openshift-marketplace(e86840e3-2c55-417d-9fa9-6eccaa01ad1a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 10:30:48 crc kubenswrapper[4740]: E1009 10:30:48.291507 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-jckll" podUID="e86840e3-2c55-417d-9fa9-6eccaa01ad1a" Oct 09 10:30:49 crc kubenswrapper[4740]: I1009 10:30:49.783243 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 10:30:50 crc kubenswrapper[4740]: E1009 10:30:50.589881 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-59flj" podUID="12c19a62-c9c0-4895-923b-4ac55e0f7c90" Oct 09 10:30:50 crc kubenswrapper[4740]: E1009 10:30:50.589971 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jckll" podUID="e86840e3-2c55-417d-9fa9-6eccaa01ad1a" Oct 09 10:30:50 crc kubenswrapper[4740]: E1009 10:30:50.674587 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 09 10:30:50 crc kubenswrapper[4740]: E1009 10:30:50.674975 4740 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wm8rm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-xxdsj_openshift-marketplace(dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 10:30:50 crc kubenswrapper[4740]: E1009 10:30:50.676182 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-xxdsj" podUID="dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b" Oct 09 10:30:51 crc kubenswrapper[4740]: E1009 10:30:51.857125 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-xxdsj" podUID="dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b" Oct 09 10:30:51 crc kubenswrapper[4740]: E1009 10:30:51.923531 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 09 10:30:51 crc kubenswrapper[4740]: E1009 10:30:51.923993 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vnlgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-lrbjh_openshift-marketplace(b3267f79-181b-4b3e-b0c6-eba2901bf0cc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 10:30:51 crc kubenswrapper[4740]: E1009 10:30:51.925196 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-lrbjh" podUID="b3267f79-181b-4b3e-b0c6-eba2901bf0cc" Oct 09 10:30:51 crc 
kubenswrapper[4740]: E1009 10:30:51.928665 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 09 10:30:51 crc kubenswrapper[4740]: E1009 10:30:51.928823 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5t8gm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-c6t4k_openshift-marketplace(ad2674eb-8f79-42a7-8d74-906279aaea2c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 10:30:51 crc kubenswrapper[4740]: E1009 10:30:51.929889 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-c6t4k" podUID="ad2674eb-8f79-42a7-8d74-906279aaea2c" Oct 09 10:30:51 crc kubenswrapper[4740]: E1009 10:30:51.951833 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 09 10:30:51 crc kubenswrapper[4740]: E1009 10:30:51.951967 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wd77s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-m5xjh_openshift-marketplace(c009e3bf-3859-4e55-95c0-dc8049291674): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 10:30:51 crc kubenswrapper[4740]: E1009 10:30:51.953152 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-m5xjh" podUID="c009e3bf-3859-4e55-95c0-dc8049291674" Oct 09 10:30:52 crc 
kubenswrapper[4740]: E1009 10:30:52.211829 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-m5xjh" podUID="c009e3bf-3859-4e55-95c0-dc8049291674" Oct 09 10:30:52 crc kubenswrapper[4740]: E1009 10:30:52.211939 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-c6t4k" podUID="ad2674eb-8f79-42a7-8d74-906279aaea2c" Oct 09 10:30:52 crc kubenswrapper[4740]: E1009 10:30:52.211976 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-lrbjh" podUID="b3267f79-181b-4b3e-b0c6-eba2901bf0cc" Oct 09 10:30:52 crc kubenswrapper[4740]: I1009 10:30:52.280562 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z74b9"] Oct 09 10:30:53 crc kubenswrapper[4740]: I1009 10:30:53.216549 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z74b9" event={"ID":"01aecf36-9a78-414c-8078-5c114c1dfa3f","Type":"ContainerStarted","Data":"0c2167ecfc2b41755a831c6df370f5756dbeceaeab65cf462f631a1814a1e2f0"} Oct 09 10:30:53 crc kubenswrapper[4740]: I1009 10:30:53.216955 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z74b9" event={"ID":"01aecf36-9a78-414c-8078-5c114c1dfa3f","Type":"ContainerStarted","Data":"2c2a55134eb9f1f010d27a4958a6b3e19b28e1be1eefd503194ff7aa2561ea3e"} Oct 09 10:30:53 crc 
kubenswrapper[4740]: I1009 10:30:53.216974 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z74b9" event={"ID":"01aecf36-9a78-414c-8078-5c114c1dfa3f","Type":"ContainerStarted","Data":"31caee620003dc599180403719b9bdb4cd45896197602bf2e1c45a9e80369dea"} Oct 09 10:30:58 crc kubenswrapper[4740]: I1009 10:30:58.784107 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-z74b9" podStartSLOduration=177.784088229 podStartE2EDuration="2m57.784088229s" podCreationTimestamp="2025-10-09 10:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:30:53.231897915 +0000 UTC m=+192.194098296" watchObservedRunningTime="2025-10-09 10:30:58.784088229 +0000 UTC m=+197.746288610" Oct 09 10:31:01 crc kubenswrapper[4740]: I1009 10:31:01.271822 4740 generic.go:334] "Generic (PLEG): container finished" podID="f6917089-254a-407e-ab54-a8085317ff82" containerID="ab02d6a942ae97069fecaadf0e614afe0a58d36b714552931661fed9dc40179a" exitCode=0 Oct 09 10:31:01 crc kubenswrapper[4740]: I1009 10:31:01.271930 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mpnqw" event={"ID":"f6917089-254a-407e-ab54-a8085317ff82","Type":"ContainerDied","Data":"ab02d6a942ae97069fecaadf0e614afe0a58d36b714552931661fed9dc40179a"} Oct 09 10:31:02 crc kubenswrapper[4740]: I1009 10:31:02.279315 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mpnqw" event={"ID":"f6917089-254a-407e-ab54-a8085317ff82","Type":"ContainerStarted","Data":"03f65739374954efa957ac3eb917c3385211843718353eb68b12be833a94bc39"} Oct 09 10:31:02 crc kubenswrapper[4740]: I1009 10:31:02.280565 4740 generic.go:334] "Generic (PLEG): container finished" podID="13877acf-3046-4702-983c-5a3fc856477c" 
containerID="3c7616f12c693d606d48f235b082d4155aae6f099a1d4ca3ce5c3a22ad375c6d" exitCode=0 Oct 09 10:31:02 crc kubenswrapper[4740]: I1009 10:31:02.280609 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l22jb" event={"ID":"13877acf-3046-4702-983c-5a3fc856477c","Type":"ContainerDied","Data":"3c7616f12c693d606d48f235b082d4155aae6f099a1d4ca3ce5c3a22ad375c6d"} Oct 09 10:31:02 crc kubenswrapper[4740]: I1009 10:31:02.320026 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mpnqw" podStartSLOduration=2.378277443 podStartE2EDuration="50.319992765s" podCreationTimestamp="2025-10-09 10:30:12 +0000 UTC" firstStartedPulling="2025-10-09 10:30:13.931253649 +0000 UTC m=+152.893454030" lastFinishedPulling="2025-10-09 10:31:01.872968971 +0000 UTC m=+200.835169352" observedRunningTime="2025-10-09 10:31:02.300049878 +0000 UTC m=+201.262250259" watchObservedRunningTime="2025-10-09 10:31:02.319992765 +0000 UTC m=+201.282193166" Oct 09 10:31:02 crc kubenswrapper[4740]: I1009 10:31:02.458393 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mpnqw" Oct 09 10:31:02 crc kubenswrapper[4740]: I1009 10:31:02.458463 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mpnqw" Oct 09 10:31:03 crc kubenswrapper[4740]: I1009 10:31:03.286991 4740 generic.go:334] "Generic (PLEG): container finished" podID="12c19a62-c9c0-4895-923b-4ac55e0f7c90" containerID="2d0f87e9bf3e7e002180f01ac9d1c2dde92a0e0657837a93755df312ea9f704e" exitCode=0 Oct 09 10:31:03 crc kubenswrapper[4740]: I1009 10:31:03.287077 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59flj" event={"ID":"12c19a62-c9c0-4895-923b-4ac55e0f7c90","Type":"ContainerDied","Data":"2d0f87e9bf3e7e002180f01ac9d1c2dde92a0e0657837a93755df312ea9f704e"} Oct 
09 10:31:03 crc kubenswrapper[4740]: I1009 10:31:03.289744 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l22jb" event={"ID":"13877acf-3046-4702-983c-5a3fc856477c","Type":"ContainerStarted","Data":"c9f5b4de7cb04660eef22387d92a81341dc8ce08b9b63b2ce977850c115f51e8"} Oct 09 10:31:03 crc kubenswrapper[4740]: I1009 10:31:03.332078 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l22jb" podStartSLOduration=2.17166798 podStartE2EDuration="52.332062501s" podCreationTimestamp="2025-10-09 10:30:11 +0000 UTC" firstStartedPulling="2025-10-09 10:30:12.895891226 +0000 UTC m=+151.858091607" lastFinishedPulling="2025-10-09 10:31:03.056285747 +0000 UTC m=+202.018486128" observedRunningTime="2025-10-09 10:31:03.328651948 +0000 UTC m=+202.290852329" watchObservedRunningTime="2025-10-09 10:31:03.332062501 +0000 UTC m=+202.294262892" Oct 09 10:31:03 crc kubenswrapper[4740]: I1009 10:31:03.612947 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-mpnqw" podUID="f6917089-254a-407e-ab54-a8085317ff82" containerName="registry-server" probeResult="failure" output=< Oct 09 10:31:03 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Oct 09 10:31:03 crc kubenswrapper[4740]: > Oct 09 10:31:04 crc kubenswrapper[4740]: I1009 10:31:04.296035 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxdsj" event={"ID":"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b","Type":"ContainerStarted","Data":"0e231d9fdc2f75c05a8514424065e4068c397f857733027d1b3a34786884769e"} Oct 09 10:31:04 crc kubenswrapper[4740]: I1009 10:31:04.297727 4740 generic.go:334] "Generic (PLEG): container finished" podID="ad2674eb-8f79-42a7-8d74-906279aaea2c" containerID="bfa0d7432a83b07f9c85d0b63f8941adff0c4d78e5a3c3fd293fbf88f1cbb138" exitCode=0 Oct 09 10:31:04 crc kubenswrapper[4740]: I1009 
10:31:04.297769 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c6t4k" event={"ID":"ad2674eb-8f79-42a7-8d74-906279aaea2c","Type":"ContainerDied","Data":"bfa0d7432a83b07f9c85d0b63f8941adff0c4d78e5a3c3fd293fbf88f1cbb138"} Oct 09 10:31:04 crc kubenswrapper[4740]: I1009 10:31:04.304253 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59flj" event={"ID":"12c19a62-c9c0-4895-923b-4ac55e0f7c90","Type":"ContainerStarted","Data":"208c3a124a2a08459b26604b205835d6eb5209af75da75f38fa0ee1d5e9c2845"} Oct 09 10:31:04 crc kubenswrapper[4740]: I1009 10:31:04.332501 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-59flj" podStartSLOduration=3.382348621 podStartE2EDuration="55.332462347s" podCreationTimestamp="2025-10-09 10:30:09 +0000 UTC" firstStartedPulling="2025-10-09 10:30:11.85702828 +0000 UTC m=+150.819228661" lastFinishedPulling="2025-10-09 10:31:03.807142006 +0000 UTC m=+202.769342387" observedRunningTime="2025-10-09 10:31:04.331277511 +0000 UTC m=+203.293477912" watchObservedRunningTime="2025-10-09 10:31:04.332462347 +0000 UTC m=+203.294662728" Oct 09 10:31:05 crc kubenswrapper[4740]: I1009 10:31:05.314023 4740 generic.go:334] "Generic (PLEG): container finished" podID="dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b" containerID="0e231d9fdc2f75c05a8514424065e4068c397f857733027d1b3a34786884769e" exitCode=0 Oct 09 10:31:05 crc kubenswrapper[4740]: I1009 10:31:05.314222 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxdsj" event={"ID":"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b","Type":"ContainerDied","Data":"0e231d9fdc2f75c05a8514424065e4068c397f857733027d1b3a34786884769e"} Oct 09 10:31:05 crc kubenswrapper[4740]: I1009 10:31:05.318800 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c6t4k" 
event={"ID":"ad2674eb-8f79-42a7-8d74-906279aaea2c","Type":"ContainerStarted","Data":"e0871ae4e564e7de56b73fdbb19cb91efc6cb3cf0e70bbb1afc0eb2138aa208c"} Oct 09 10:31:05 crc kubenswrapper[4740]: I1009 10:31:05.320856 4740 generic.go:334] "Generic (PLEG): container finished" podID="e86840e3-2c55-417d-9fa9-6eccaa01ad1a" containerID="c85899e4d065dc4235a56bc2bc70b1ee8953fb242b5f9f4a8629d6f1f89e107f" exitCode=0 Oct 09 10:31:05 crc kubenswrapper[4740]: I1009 10:31:05.320896 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jckll" event={"ID":"e86840e3-2c55-417d-9fa9-6eccaa01ad1a","Type":"ContainerDied","Data":"c85899e4d065dc4235a56bc2bc70b1ee8953fb242b5f9f4a8629d6f1f89e107f"} Oct 09 10:31:05 crc kubenswrapper[4740]: I1009 10:31:05.379363 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c6t4k" podStartSLOduration=2.550281025 podStartE2EDuration="55.379339694s" podCreationTimestamp="2025-10-09 10:30:10 +0000 UTC" firstStartedPulling="2025-10-09 10:30:11.861896779 +0000 UTC m=+150.824097180" lastFinishedPulling="2025-10-09 10:31:04.690955468 +0000 UTC m=+203.653155849" observedRunningTime="2025-10-09 10:31:05.377811898 +0000 UTC m=+204.340012299" watchObservedRunningTime="2025-10-09 10:31:05.379339694 +0000 UTC m=+204.341540095" Oct 09 10:31:05 crc kubenswrapper[4740]: I1009 10:31:05.408110 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 10:31:05 crc kubenswrapper[4740]: I1009 10:31:05.408211 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 10:31:05 crc kubenswrapper[4740]: I1009 10:31:05.408349 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 10:31:05 crc kubenswrapper[4740]: I1009 10:31:05.409494 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f"} pod="openshift-machine-config-operator/machine-config-daemon-kdjch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 10:31:05 crc kubenswrapper[4740]: I1009 10:31:05.409862 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" containerID="cri-o://d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f" gracePeriod=600 Oct 09 10:31:06 crc kubenswrapper[4740]: I1009 10:31:06.328922 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxdsj" event={"ID":"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b","Type":"ContainerStarted","Data":"d94dcd873b368d5bc043b95c57c7e4d4475a78b9177e5dbe6601450705191d83"} Oct 09 10:31:06 crc kubenswrapper[4740]: I1009 10:31:06.330974 4740 generic.go:334] "Generic (PLEG): container finished" podID="223b849a-db98-4f56-a649-9e144189950a" containerID="d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f" exitCode=0 Oct 09 10:31:06 crc kubenswrapper[4740]: I1009 10:31:06.331049 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" 
event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerDied","Data":"d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f"} Oct 09 10:31:06 crc kubenswrapper[4740]: I1009 10:31:06.331104 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerStarted","Data":"63604de549d11fb7c2176acf5b52492977a036f22bcf527081116471ae69cb41"} Oct 09 10:31:06 crc kubenswrapper[4740]: I1009 10:31:06.332963 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jckll" event={"ID":"e86840e3-2c55-417d-9fa9-6eccaa01ad1a","Type":"ContainerStarted","Data":"be45ce4ea72aff19cca6f8a8001dec6f8c5a001834a2e413b65c9ca29d6c784a"} Oct 09 10:31:06 crc kubenswrapper[4740]: I1009 10:31:06.345876 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xxdsj" podStartSLOduration=2.3291145220000002 podStartE2EDuration="53.345858695s" podCreationTimestamp="2025-10-09 10:30:13 +0000 UTC" firstStartedPulling="2025-10-09 10:30:14.964364333 +0000 UTC m=+153.926564714" lastFinishedPulling="2025-10-09 10:31:05.981108506 +0000 UTC m=+204.943308887" observedRunningTime="2025-10-09 10:31:06.345254227 +0000 UTC m=+205.307454608" watchObservedRunningTime="2025-10-09 10:31:06.345858695 +0000 UTC m=+205.308059076" Oct 09 10:31:06 crc kubenswrapper[4740]: I1009 10:31:06.365092 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jckll" podStartSLOduration=3.294518615 podStartE2EDuration="57.365071751s" podCreationTimestamp="2025-10-09 10:30:09 +0000 UTC" firstStartedPulling="2025-10-09 10:30:11.866679115 +0000 UTC m=+150.828879516" lastFinishedPulling="2025-10-09 10:31:05.937232251 +0000 UTC m=+204.899432652" observedRunningTime="2025-10-09 10:31:06.362011689 +0000 UTC m=+205.324212080" 
watchObservedRunningTime="2025-10-09 10:31:06.365071751 +0000 UTC m=+205.327272142" Oct 09 10:31:07 crc kubenswrapper[4740]: I1009 10:31:07.339808 4740 generic.go:334] "Generic (PLEG): container finished" podID="b3267f79-181b-4b3e-b0c6-eba2901bf0cc" containerID="de032ebbb0a29e30b9558f11b0024b295856242043de1a730755a8cc3f52bae4" exitCode=0 Oct 09 10:31:07 crc kubenswrapper[4740]: I1009 10:31:07.339882 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrbjh" event={"ID":"b3267f79-181b-4b3e-b0c6-eba2901bf0cc","Type":"ContainerDied","Data":"de032ebbb0a29e30b9558f11b0024b295856242043de1a730755a8cc3f52bae4"} Oct 09 10:31:08 crc kubenswrapper[4740]: I1009 10:31:08.348662 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5xjh" event={"ID":"c009e3bf-3859-4e55-95c0-dc8049291674","Type":"ContainerStarted","Data":"ed77283d0d7834790b8aa54ab234f94261a771692b284177aeb0205ff3bf70fb"} Oct 09 10:31:08 crc kubenswrapper[4740]: I1009 10:31:08.350929 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrbjh" event={"ID":"b3267f79-181b-4b3e-b0c6-eba2901bf0cc","Type":"ContainerStarted","Data":"23f0b623e9ea42cab5f7e4b756e0107bb34f00c077d59f84cb3d49830cab4af7"} Oct 09 10:31:08 crc kubenswrapper[4740]: I1009 10:31:08.392282 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lrbjh" podStartSLOduration=3.420148556 podStartE2EDuration="59.392251352s" podCreationTimestamp="2025-10-09 10:30:09 +0000 UTC" firstStartedPulling="2025-10-09 10:30:11.85588969 +0000 UTC m=+150.818090071" lastFinishedPulling="2025-10-09 10:31:07.827992486 +0000 UTC m=+206.790192867" observedRunningTime="2025-10-09 10:31:08.391317254 +0000 UTC m=+207.353517655" watchObservedRunningTime="2025-10-09 10:31:08.392251352 +0000 UTC m=+207.354451773" Oct 09 10:31:09 crc kubenswrapper[4740]: I1009 10:31:09.356904 
4740 generic.go:334] "Generic (PLEG): container finished" podID="c009e3bf-3859-4e55-95c0-dc8049291674" containerID="ed77283d0d7834790b8aa54ab234f94261a771692b284177aeb0205ff3bf70fb" exitCode=0 Oct 09 10:31:09 crc kubenswrapper[4740]: I1009 10:31:09.356993 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5xjh" event={"ID":"c009e3bf-3859-4e55-95c0-dc8049291674","Type":"ContainerDied","Data":"ed77283d0d7834790b8aa54ab234f94261a771692b284177aeb0205ff3bf70fb"} Oct 09 10:31:09 crc kubenswrapper[4740]: I1009 10:31:09.812209 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jckll" Oct 09 10:31:09 crc kubenswrapper[4740]: I1009 10:31:09.813213 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jckll" Oct 09 10:31:10 crc kubenswrapper[4740]: I1009 10:31:10.047289 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lrbjh" Oct 09 10:31:10 crc kubenswrapper[4740]: I1009 10:31:10.048617 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lrbjh" Oct 09 10:31:10 crc kubenswrapper[4740]: I1009 10:31:10.060898 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jckll" Oct 09 10:31:10 crc kubenswrapper[4740]: I1009 10:31:10.090519 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lrbjh" Oct 09 10:31:10 crc kubenswrapper[4740]: I1009 10:31:10.226422 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-59flj" Oct 09 10:31:10 crc kubenswrapper[4740]: I1009 10:31:10.226481 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-59flj" Oct 09 10:31:10 crc kubenswrapper[4740]: I1009 10:31:10.267087 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-59flj" Oct 09 10:31:10 crc kubenswrapper[4740]: I1009 10:31:10.416273 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-59flj" Oct 09 10:31:10 crc kubenswrapper[4740]: I1009 10:31:10.437369 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c6t4k" Oct 09 10:31:10 crc kubenswrapper[4740]: I1009 10:31:10.437629 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c6t4k" Oct 09 10:31:10 crc kubenswrapper[4740]: I1009 10:31:10.479021 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c6t4k" Oct 09 10:31:11 crc kubenswrapper[4740]: I1009 10:31:11.202063 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-59flj"] Oct 09 10:31:11 crc kubenswrapper[4740]: I1009 10:31:11.404411 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c6t4k" Oct 09 10:31:11 crc kubenswrapper[4740]: I1009 10:31:11.404467 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jckll" Oct 09 10:31:12 crc kubenswrapper[4740]: I1009 10:31:12.008412 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l22jb" Oct 09 10:31:12 crc kubenswrapper[4740]: I1009 10:31:12.008460 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l22jb" Oct 09 10:31:12 crc kubenswrapper[4740]: I1009 
10:31:12.045589 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l22jb" Oct 09 10:31:12 crc kubenswrapper[4740]: I1009 10:31:12.370110 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-59flj" podUID="12c19a62-c9c0-4895-923b-4ac55e0f7c90" containerName="registry-server" containerID="cri-o://208c3a124a2a08459b26604b205835d6eb5209af75da75f38fa0ee1d5e9c2845" gracePeriod=2 Oct 09 10:31:12 crc kubenswrapper[4740]: I1009 10:31:12.410163 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l22jb" Oct 09 10:31:12 crc kubenswrapper[4740]: I1009 10:31:12.495126 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mpnqw" Oct 09 10:31:12 crc kubenswrapper[4740]: I1009 10:31:12.532462 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mpnqw" Oct 09 10:31:13 crc kubenswrapper[4740]: I1009 10:31:13.001681 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c6t4k"] Oct 09 10:31:13 crc kubenswrapper[4740]: I1009 10:31:13.382603 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59flj" event={"ID":"12c19a62-c9c0-4895-923b-4ac55e0f7c90","Type":"ContainerDied","Data":"208c3a124a2a08459b26604b205835d6eb5209af75da75f38fa0ee1d5e9c2845"} Oct 09 10:31:13 crc kubenswrapper[4740]: I1009 10:31:13.382726 4740 generic.go:334] "Generic (PLEG): container finished" podID="12c19a62-c9c0-4895-923b-4ac55e0f7c90" containerID="208c3a124a2a08459b26604b205835d6eb5209af75da75f38fa0ee1d5e9c2845" exitCode=0 Oct 09 10:31:13 crc kubenswrapper[4740]: I1009 10:31:13.383116 4740 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-c6t4k" podUID="ad2674eb-8f79-42a7-8d74-906279aaea2c" containerName="registry-server" containerID="cri-o://e0871ae4e564e7de56b73fdbb19cb91efc6cb3cf0e70bbb1afc0eb2138aa208c" gracePeriod=2 Oct 09 10:31:13 crc kubenswrapper[4740]: I1009 10:31:13.431567 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xxdsj" Oct 09 10:31:13 crc kubenswrapper[4740]: I1009 10:31:13.431628 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xxdsj" Oct 09 10:31:13 crc kubenswrapper[4740]: I1009 10:31:13.472025 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xxdsj" Oct 09 10:31:14 crc kubenswrapper[4740]: I1009 10:31:14.164584 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-59flj" Oct 09 10:31:14 crc kubenswrapper[4740]: I1009 10:31:14.261977 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12c19a62-c9c0-4895-923b-4ac55e0f7c90-utilities\") pod \"12c19a62-c9c0-4895-923b-4ac55e0f7c90\" (UID: \"12c19a62-c9c0-4895-923b-4ac55e0f7c90\") " Oct 09 10:31:14 crc kubenswrapper[4740]: I1009 10:31:14.262030 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12c19a62-c9c0-4895-923b-4ac55e0f7c90-catalog-content\") pod \"12c19a62-c9c0-4895-923b-4ac55e0f7c90\" (UID: \"12c19a62-c9c0-4895-923b-4ac55e0f7c90\") " Oct 09 10:31:14 crc kubenswrapper[4740]: I1009 10:31:14.263599 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snkxt\" (UniqueName: \"kubernetes.io/projected/12c19a62-c9c0-4895-923b-4ac55e0f7c90-kube-api-access-snkxt\") pod 
\"12c19a62-c9c0-4895-923b-4ac55e0f7c90\" (UID: \"12c19a62-c9c0-4895-923b-4ac55e0f7c90\") " Oct 09 10:31:14 crc kubenswrapper[4740]: I1009 10:31:14.264305 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12c19a62-c9c0-4895-923b-4ac55e0f7c90-utilities" (OuterVolumeSpecName: "utilities") pod "12c19a62-c9c0-4895-923b-4ac55e0f7c90" (UID: "12c19a62-c9c0-4895-923b-4ac55e0f7c90"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:31:14 crc kubenswrapper[4740]: I1009 10:31:14.269940 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12c19a62-c9c0-4895-923b-4ac55e0f7c90-kube-api-access-snkxt" (OuterVolumeSpecName: "kube-api-access-snkxt") pod "12c19a62-c9c0-4895-923b-4ac55e0f7c90" (UID: "12c19a62-c9c0-4895-923b-4ac55e0f7c90"). InnerVolumeSpecName "kube-api-access-snkxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:31:14 crc kubenswrapper[4740]: I1009 10:31:14.364658 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12c19a62-c9c0-4895-923b-4ac55e0f7c90-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:14 crc kubenswrapper[4740]: I1009 10:31:14.364696 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snkxt\" (UniqueName: \"kubernetes.io/projected/12c19a62-c9c0-4895-923b-4ac55e0f7c90-kube-api-access-snkxt\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:14 crc kubenswrapper[4740]: I1009 10:31:14.390042 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59flj" event={"ID":"12c19a62-c9c0-4895-923b-4ac55e0f7c90","Type":"ContainerDied","Data":"ddc80bfe353586c6590adbc9fafcd6f2a197a771e754550224b6c732a5067c9c"} Oct 09 10:31:14 crc kubenswrapper[4740]: I1009 10:31:14.390096 4740 scope.go:117] "RemoveContainer" 
containerID="208c3a124a2a08459b26604b205835d6eb5209af75da75f38fa0ee1d5e9c2845" Oct 09 10:31:14 crc kubenswrapper[4740]: I1009 10:31:14.390214 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-59flj" Oct 09 10:31:14 crc kubenswrapper[4740]: I1009 10:31:14.393854 4740 generic.go:334] "Generic (PLEG): container finished" podID="ad2674eb-8f79-42a7-8d74-906279aaea2c" containerID="e0871ae4e564e7de56b73fdbb19cb91efc6cb3cf0e70bbb1afc0eb2138aa208c" exitCode=0 Oct 09 10:31:14 crc kubenswrapper[4740]: I1009 10:31:14.393894 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c6t4k" event={"ID":"ad2674eb-8f79-42a7-8d74-906279aaea2c","Type":"ContainerDied","Data":"e0871ae4e564e7de56b73fdbb19cb91efc6cb3cf0e70bbb1afc0eb2138aa208c"} Oct 09 10:31:14 crc kubenswrapper[4740]: I1009 10:31:14.396449 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5xjh" event={"ID":"c009e3bf-3859-4e55-95c0-dc8049291674","Type":"ContainerStarted","Data":"2ae301d42df4039b2b39d02d335bc34265890c31fda2170845ab746fbf95d622"} Oct 09 10:31:14 crc kubenswrapper[4740]: I1009 10:31:14.405740 4740 scope.go:117] "RemoveContainer" containerID="2d0f87e9bf3e7e002180f01ac9d1c2dde92a0e0657837a93755df312ea9f704e" Oct 09 10:31:14 crc kubenswrapper[4740]: I1009 10:31:14.431678 4740 scope.go:117] "RemoveContainer" containerID="05540e0826659e9bf1e7a3e53a0579345fb517d509f5594b55b7cbdfab42dab4" Oct 09 10:31:14 crc kubenswrapper[4740]: I1009 10:31:14.444520 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xxdsj" Oct 09 10:31:14 crc kubenswrapper[4740]: I1009 10:31:14.466192 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m5xjh" podStartSLOduration=4.299848408 podStartE2EDuration="1m2.466174629s" 
podCreationTimestamp="2025-10-09 10:30:12 +0000 UTC" firstStartedPulling="2025-10-09 10:30:14.967337871 +0000 UTC m=+153.929538252" lastFinishedPulling="2025-10-09 10:31:13.133664092 +0000 UTC m=+212.095864473" observedRunningTime="2025-10-09 10:31:14.418486331 +0000 UTC m=+213.380686712" watchObservedRunningTime="2025-10-09 10:31:14.466174629 +0000 UTC m=+213.428375010" Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.261566 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12c19a62-c9c0-4895-923b-4ac55e0f7c90-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12c19a62-c9c0-4895-923b-4ac55e0f7c90" (UID: "12c19a62-c9c0-4895-923b-4ac55e0f7c90"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.275262 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12c19a62-c9c0-4895-923b-4ac55e0f7c90-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.287919 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c6t4k" Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.321434 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-59flj"] Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.324840 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-59flj"] Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.376340 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad2674eb-8f79-42a7-8d74-906279aaea2c-catalog-content\") pod \"ad2674eb-8f79-42a7-8d74-906279aaea2c\" (UID: \"ad2674eb-8f79-42a7-8d74-906279aaea2c\") " Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.376448 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t8gm\" (UniqueName: \"kubernetes.io/projected/ad2674eb-8f79-42a7-8d74-906279aaea2c-kube-api-access-5t8gm\") pod \"ad2674eb-8f79-42a7-8d74-906279aaea2c\" (UID: \"ad2674eb-8f79-42a7-8d74-906279aaea2c\") " Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.376484 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad2674eb-8f79-42a7-8d74-906279aaea2c-utilities\") pod \"ad2674eb-8f79-42a7-8d74-906279aaea2c\" (UID: \"ad2674eb-8f79-42a7-8d74-906279aaea2c\") " Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.377209 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad2674eb-8f79-42a7-8d74-906279aaea2c-utilities" (OuterVolumeSpecName: "utilities") pod "ad2674eb-8f79-42a7-8d74-906279aaea2c" (UID: "ad2674eb-8f79-42a7-8d74-906279aaea2c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.381455 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad2674eb-8f79-42a7-8d74-906279aaea2c-kube-api-access-5t8gm" (OuterVolumeSpecName: "kube-api-access-5t8gm") pod "ad2674eb-8f79-42a7-8d74-906279aaea2c" (UID: "ad2674eb-8f79-42a7-8d74-906279aaea2c"). InnerVolumeSpecName "kube-api-access-5t8gm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.405680 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c6t4k" event={"ID":"ad2674eb-8f79-42a7-8d74-906279aaea2c","Type":"ContainerDied","Data":"571bcde2c9dc4fb9608363fcb9b299dc73503fe8f962fca93113ad9955c56cc3"} Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.405739 4740 scope.go:117] "RemoveContainer" containerID="e0871ae4e564e7de56b73fdbb19cb91efc6cb3cf0e70bbb1afc0eb2138aa208c" Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.405810 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c6t4k" Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.421761 4740 scope.go:117] "RemoveContainer" containerID="bfa0d7432a83b07f9c85d0b63f8941adff0c4d78e5a3c3fd293fbf88f1cbb138" Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.438125 4740 scope.go:117] "RemoveContainer" containerID="3dcf1f9127cec8fdb8f882fa4877d383e0101d6a1342536db9c46cb11a31903c" Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.474702 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad2674eb-8f79-42a7-8d74-906279aaea2c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad2674eb-8f79-42a7-8d74-906279aaea2c" (UID: "ad2674eb-8f79-42a7-8d74-906279aaea2c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.477596 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t8gm\" (UniqueName: \"kubernetes.io/projected/ad2674eb-8f79-42a7-8d74-906279aaea2c-kube-api-access-5t8gm\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.477634 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad2674eb-8f79-42a7-8d74-906279aaea2c-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.477648 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad2674eb-8f79-42a7-8d74-906279aaea2c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.599414 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mpnqw"] Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.599643 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mpnqw" podUID="f6917089-254a-407e-ab54-a8085317ff82" containerName="registry-server" containerID="cri-o://03f65739374954efa957ac3eb917c3385211843718353eb68b12be833a94bc39" gracePeriod=2 Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.766251 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12c19a62-c9c0-4895-923b-4ac55e0f7c90" path="/var/lib/kubelet/pods/12c19a62-c9c0-4895-923b-4ac55e0f7c90/volumes" Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.767582 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c6t4k"] Oct 09 10:31:15 crc kubenswrapper[4740]: I1009 10:31:15.767650 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-c6t4k"] Oct 09 10:31:16 crc kubenswrapper[4740]: I1009 10:31:16.415594 4740 generic.go:334] "Generic (PLEG): container finished" podID="f6917089-254a-407e-ab54-a8085317ff82" containerID="03f65739374954efa957ac3eb917c3385211843718353eb68b12be833a94bc39" exitCode=0 Oct 09 10:31:16 crc kubenswrapper[4740]: I1009 10:31:16.415675 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mpnqw" event={"ID":"f6917089-254a-407e-ab54-a8085317ff82","Type":"ContainerDied","Data":"03f65739374954efa957ac3eb917c3385211843718353eb68b12be833a94bc39"} Oct 09 10:31:16 crc kubenswrapper[4740]: I1009 10:31:16.814650 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mpnqw" Oct 09 10:31:16 crc kubenswrapper[4740]: I1009 10:31:16.898142 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txz6h\" (UniqueName: \"kubernetes.io/projected/f6917089-254a-407e-ab54-a8085317ff82-kube-api-access-txz6h\") pod \"f6917089-254a-407e-ab54-a8085317ff82\" (UID: \"f6917089-254a-407e-ab54-a8085317ff82\") " Oct 09 10:31:16 crc kubenswrapper[4740]: I1009 10:31:16.898265 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6917089-254a-407e-ab54-a8085317ff82-catalog-content\") pod \"f6917089-254a-407e-ab54-a8085317ff82\" (UID: \"f6917089-254a-407e-ab54-a8085317ff82\") " Oct 09 10:31:16 crc kubenswrapper[4740]: I1009 10:31:16.898293 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6917089-254a-407e-ab54-a8085317ff82-utilities\") pod \"f6917089-254a-407e-ab54-a8085317ff82\" (UID: \"f6917089-254a-407e-ab54-a8085317ff82\") " Oct 09 10:31:16 crc kubenswrapper[4740]: I1009 10:31:16.899240 4740 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6917089-254a-407e-ab54-a8085317ff82-utilities" (OuterVolumeSpecName: "utilities") pod "f6917089-254a-407e-ab54-a8085317ff82" (UID: "f6917089-254a-407e-ab54-a8085317ff82"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:31:16 crc kubenswrapper[4740]: I1009 10:31:16.907924 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6917089-254a-407e-ab54-a8085317ff82-kube-api-access-txz6h" (OuterVolumeSpecName: "kube-api-access-txz6h") pod "f6917089-254a-407e-ab54-a8085317ff82" (UID: "f6917089-254a-407e-ab54-a8085317ff82"). InnerVolumeSpecName "kube-api-access-txz6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:31:16 crc kubenswrapper[4740]: I1009 10:31:16.910819 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6917089-254a-407e-ab54-a8085317ff82-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6917089-254a-407e-ab54-a8085317ff82" (UID: "f6917089-254a-407e-ab54-a8085317ff82"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:31:17 crc kubenswrapper[4740]: I1009 10:31:17.000075 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txz6h\" (UniqueName: \"kubernetes.io/projected/f6917089-254a-407e-ab54-a8085317ff82-kube-api-access-txz6h\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:17 crc kubenswrapper[4740]: I1009 10:31:17.000326 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6917089-254a-407e-ab54-a8085317ff82-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:17 crc kubenswrapper[4740]: I1009 10:31:17.000406 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6917089-254a-407e-ab54-a8085317ff82-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:17 crc kubenswrapper[4740]: I1009 10:31:17.422279 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mpnqw" event={"ID":"f6917089-254a-407e-ab54-a8085317ff82","Type":"ContainerDied","Data":"4aa133ec7523efd438148a75d74465e73748d47e48146e429ae304fa194fa603"} Oct 09 10:31:17 crc kubenswrapper[4740]: I1009 10:31:17.422378 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mpnqw" Oct 09 10:31:17 crc kubenswrapper[4740]: I1009 10:31:17.422554 4740 scope.go:117] "RemoveContainer" containerID="03f65739374954efa957ac3eb917c3385211843718353eb68b12be833a94bc39" Oct 09 10:31:17 crc kubenswrapper[4740]: I1009 10:31:17.442071 4740 scope.go:117] "RemoveContainer" containerID="ab02d6a942ae97069fecaadf0e614afe0a58d36b714552931661fed9dc40179a" Oct 09 10:31:17 crc kubenswrapper[4740]: I1009 10:31:17.456602 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mpnqw"] Oct 09 10:31:17 crc kubenswrapper[4740]: I1009 10:31:17.462516 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mpnqw"] Oct 09 10:31:17 crc kubenswrapper[4740]: I1009 10:31:17.470772 4740 scope.go:117] "RemoveContainer" containerID="eca573bf16b4a0022494769158fd04296a17ac1a2ffbfee9d4b9a85c67832e68" Oct 09 10:31:17 crc kubenswrapper[4740]: I1009 10:31:17.760766 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad2674eb-8f79-42a7-8d74-906279aaea2c" path="/var/lib/kubelet/pods/ad2674eb-8f79-42a7-8d74-906279aaea2c/volumes" Oct 09 10:31:17 crc kubenswrapper[4740]: I1009 10:31:17.761323 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6917089-254a-407e-ab54-a8085317ff82" path="/var/lib/kubelet/pods/f6917089-254a-407e-ab54-a8085317ff82/volumes" Oct 09 10:31:17 crc kubenswrapper[4740]: I1009 10:31:17.998600 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xxdsj"] Oct 09 10:31:17 crc kubenswrapper[4740]: I1009 10:31:17.998909 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xxdsj" podUID="dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b" containerName="registry-server" containerID="cri-o://d94dcd873b368d5bc043b95c57c7e4d4475a78b9177e5dbe6601450705191d83" gracePeriod=2 
Oct 09 10:31:19 crc kubenswrapper[4740]: I1009 10:31:19.436441 4740 generic.go:334] "Generic (PLEG): container finished" podID="dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b" containerID="d94dcd873b368d5bc043b95c57c7e4d4475a78b9177e5dbe6601450705191d83" exitCode=0 Oct 09 10:31:19 crc kubenswrapper[4740]: I1009 10:31:19.436541 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxdsj" event={"ID":"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b","Type":"ContainerDied","Data":"d94dcd873b368d5bc043b95c57c7e4d4475a78b9177e5dbe6601450705191d83"} Oct 09 10:31:19 crc kubenswrapper[4740]: I1009 10:31:19.733203 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xxdsj" Oct 09 10:31:19 crc kubenswrapper[4740]: I1009 10:31:19.833823 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b-utilities\") pod \"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b\" (UID: \"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b\") " Oct 09 10:31:19 crc kubenswrapper[4740]: I1009 10:31:19.833919 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b-catalog-content\") pod \"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b\" (UID: \"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b\") " Oct 09 10:31:19 crc kubenswrapper[4740]: I1009 10:31:19.833975 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm8rm\" (UniqueName: \"kubernetes.io/projected/dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b-kube-api-access-wm8rm\") pod \"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b\" (UID: \"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b\") " Oct 09 10:31:19 crc kubenswrapper[4740]: I1009 10:31:19.835556 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b-utilities" (OuterVolumeSpecName: "utilities") pod "dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b" (UID: "dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:31:19 crc kubenswrapper[4740]: I1009 10:31:19.850863 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b-kube-api-access-wm8rm" (OuterVolumeSpecName: "kube-api-access-wm8rm") pod "dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b" (UID: "dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b"). InnerVolumeSpecName "kube-api-access-wm8rm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:31:19 crc kubenswrapper[4740]: I1009 10:31:19.911872 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b" (UID: "dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:31:19 crc kubenswrapper[4740]: I1009 10:31:19.934898 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:19 crc kubenswrapper[4740]: I1009 10:31:19.934936 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:19 crc kubenswrapper[4740]: I1009 10:31:19.934953 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm8rm\" (UniqueName: \"kubernetes.io/projected/dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b-kube-api-access-wm8rm\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:20 crc kubenswrapper[4740]: I1009 10:31:20.095081 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lrbjh" Oct 09 10:31:20 crc kubenswrapper[4740]: I1009 10:31:20.447038 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxdsj" event={"ID":"dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b","Type":"ContainerDied","Data":"d32c81a27e6ed6646ad8abcc0fe1ab8f027c7146caa173c08c198f5e0e9b5f2a"} Oct 09 10:31:20 crc kubenswrapper[4740]: I1009 10:31:20.447148 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xxdsj" Oct 09 10:31:20 crc kubenswrapper[4740]: I1009 10:31:20.447431 4740 scope.go:117] "RemoveContainer" containerID="d94dcd873b368d5bc043b95c57c7e4d4475a78b9177e5dbe6601450705191d83" Oct 09 10:31:20 crc kubenswrapper[4740]: I1009 10:31:20.463600 4740 scope.go:117] "RemoveContainer" containerID="0e231d9fdc2f75c05a8514424065e4068c397f857733027d1b3a34786884769e" Oct 09 10:31:20 crc kubenswrapper[4740]: I1009 10:31:20.475409 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xxdsj"] Oct 09 10:31:20 crc kubenswrapper[4740]: I1009 10:31:20.478775 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xxdsj"] Oct 09 10:31:20 crc kubenswrapper[4740]: I1009 10:31:20.502033 4740 scope.go:117] "RemoveContainer" containerID="f01b824ad97841dbeab6440fb4ec0f58f9bf2d374c13d4f05aec0144e04418ab" Oct 09 10:31:21 crc kubenswrapper[4740]: I1009 10:31:21.760608 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b" path="/var/lib/kubelet/pods/dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b/volumes" Oct 09 10:31:23 crc kubenswrapper[4740]: I1009 10:31:23.034032 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m5xjh" Oct 09 10:31:23 crc kubenswrapper[4740]: I1009 10:31:23.034432 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m5xjh" Oct 09 10:31:23 crc kubenswrapper[4740]: I1009 10:31:23.075411 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m5xjh" Oct 09 10:31:23 crc kubenswrapper[4740]: I1009 10:31:23.497775 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m5xjh" Oct 09 10:31:31 crc kubenswrapper[4740]: I1009 
10:31:31.647951 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6zqw2"] Oct 09 10:31:56 crc kubenswrapper[4740]: I1009 10:31:56.673370 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" podUID="b2d12f51-5b3d-4d6f-899f-af629cc0d4ad" containerName="oauth-openshift" containerID="cri-o://6970bbe6018e4ea1e2edd85148d5bdd90d12742526de084ef529dd869f9d6054" gracePeriod=15 Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.078729 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.112383 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6b788bb46c-qjpgn"] Oct 09 10:31:57 crc kubenswrapper[4740]: E1009 10:31:57.112622 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2d12f51-5b3d-4d6f-899f-af629cc0d4ad" containerName="oauth-openshift" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.112637 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2d12f51-5b3d-4d6f-899f-af629cc0d4ad" containerName="oauth-openshift" Oct 09 10:31:57 crc kubenswrapper[4740]: E1009 10:31:57.112649 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6917089-254a-407e-ab54-a8085317ff82" containerName="registry-server" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.112658 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6917089-254a-407e-ab54-a8085317ff82" containerName="registry-server" Oct 09 10:31:57 crc kubenswrapper[4740]: E1009 10:31:57.112671 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30103fa4-b346-46dd-b1ac-478a2680acc9" containerName="pruner" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.112679 4740 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="30103fa4-b346-46dd-b1ac-478a2680acc9" containerName="pruner" Oct 09 10:31:57 crc kubenswrapper[4740]: E1009 10:31:57.112693 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad2674eb-8f79-42a7-8d74-906279aaea2c" containerName="registry-server" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.112701 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2674eb-8f79-42a7-8d74-906279aaea2c" containerName="registry-server" Oct 09 10:31:57 crc kubenswrapper[4740]: E1009 10:31:57.112713 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad2674eb-8f79-42a7-8d74-906279aaea2c" containerName="extract-utilities" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.112721 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2674eb-8f79-42a7-8d74-906279aaea2c" containerName="extract-utilities" Oct 09 10:31:57 crc kubenswrapper[4740]: E1009 10:31:57.112731 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12c19a62-c9c0-4895-923b-4ac55e0f7c90" containerName="extract-content" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.112740 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="12c19a62-c9c0-4895-923b-4ac55e0f7c90" containerName="extract-content" Oct 09 10:31:57 crc kubenswrapper[4740]: E1009 10:31:57.112781 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12c19a62-c9c0-4895-923b-4ac55e0f7c90" containerName="registry-server" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.112793 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="12c19a62-c9c0-4895-923b-4ac55e0f7c90" containerName="registry-server" Oct 09 10:31:57 crc kubenswrapper[4740]: E1009 10:31:57.112812 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12c19a62-c9c0-4895-923b-4ac55e0f7c90" containerName="extract-utilities" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.112822 4740 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="12c19a62-c9c0-4895-923b-4ac55e0f7c90" containerName="extract-utilities" Oct 09 10:31:57 crc kubenswrapper[4740]: E1009 10:31:57.112838 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b" containerName="extract-content" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.112848 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b" containerName="extract-content" Oct 09 10:31:57 crc kubenswrapper[4740]: E1009 10:31:57.112862 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b" containerName="extract-utilities" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.112871 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b" containerName="extract-utilities" Oct 09 10:31:57 crc kubenswrapper[4740]: E1009 10:31:57.112881 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6917089-254a-407e-ab54-a8085317ff82" containerName="extract-utilities" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.112889 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6917089-254a-407e-ab54-a8085317ff82" containerName="extract-utilities" Oct 09 10:31:57 crc kubenswrapper[4740]: E1009 10:31:57.112900 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad2674eb-8f79-42a7-8d74-906279aaea2c" containerName="extract-content" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.112907 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2674eb-8f79-42a7-8d74-906279aaea2c" containerName="extract-content" Oct 09 10:31:57 crc kubenswrapper[4740]: E1009 10:31:57.112916 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b" containerName="registry-server" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.112924 4740 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b" containerName="registry-server" Oct 09 10:31:57 crc kubenswrapper[4740]: E1009 10:31:57.112934 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6917089-254a-407e-ab54-a8085317ff82" containerName="extract-content" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.112942 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6917089-254a-407e-ab54-a8085317ff82" containerName="extract-content" Oct 09 10:31:57 crc kubenswrapper[4740]: E1009 10:31:57.112951 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1acaf9-1d30-4def-ab26-b58667065c61" containerName="pruner" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.112960 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1acaf9-1d30-4def-ab26-b58667065c61" containerName="pruner" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.113078 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e1acaf9-1d30-4def-ab26-b58667065c61" containerName="pruner" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.113093 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbfab557-0ba2-49b1-88ed-d9c9a23a0e6b" containerName="registry-server" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.113104 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="30103fa4-b346-46dd-b1ac-478a2680acc9" containerName="pruner" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.113117 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6917089-254a-407e-ab54-a8085317ff82" containerName="registry-server" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.113128 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2d12f51-5b3d-4d6f-899f-af629cc0d4ad" containerName="oauth-openshift" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.113141 4740 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ad2674eb-8f79-42a7-8d74-906279aaea2c" containerName="registry-server" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.113155 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="12c19a62-c9c0-4895-923b-4ac55e0f7c90" containerName="registry-server" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.114005 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.130403 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6b788bb46c-qjpgn"] Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.244297 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqxth\" (UniqueName: \"kubernetes.io/projected/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-kube-api-access-sqxth\") pod \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.244342 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-template-error\") pod \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.244371 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-audit-policies\") pod \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.244394 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-template-login\") pod \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.245170 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad" (UID: "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.245232 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-service-ca\") pod \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.245252 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-audit-dir\") pod \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.245393 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad" (UID: "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.245512 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad" (UID: "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.245564 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-idp-0-file-data\") pod \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.245601 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-ocp-branding-template\") pod \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.245933 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-router-certs\") pod \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.245959 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-template-provider-selection\") pod \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.246012 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-trusted-ca-bundle\") pod \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.246050 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-session\") pod \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.246080 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-serving-cert\") pod \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.246098 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-cliconfig\") pod \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\" (UID: \"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad\") " Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.246235 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.246258 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.246281 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7m9b\" (UniqueName: \"kubernetes.io/projected/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-kube-api-access-w7m9b\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.246302 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-audit-dir\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.246319 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: 
\"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.246337 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-system-session\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.246355 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-audit-policies\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.246381 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-user-template-login\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.246396 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.246414 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.246446 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.246462 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.246482 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-user-template-error\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.246506 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" 
(UniqueName: \"kubernetes.io/configmap/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.247189 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad" (UID: "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.247485 4740 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.247524 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.247553 4740 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.247587 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad" (UID: "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad"). 
InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.249759 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad" (UID: "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.250063 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad" (UID: "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.250901 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad" (UID: "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.251005 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad" (UID: "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad"). 
InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.251133 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad" (UID: "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.251302 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad" (UID: "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.251330 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-kube-api-access-sqxth" (OuterVolumeSpecName: "kube-api-access-sqxth") pod "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad" (UID: "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad"). InnerVolumeSpecName "kube-api-access-sqxth". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.251484 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad" (UID: "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad"). 
InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.259656 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad" (UID: "b2d12f51-5b3d-4d6f-899f-af629cc0d4ad"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.349207 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7m9b\" (UniqueName: \"kubernetes.io/projected/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-kube-api-access-w7m9b\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.349298 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-audit-dir\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.349345 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.349388 4740 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-system-session\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.349438 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-audit-policies\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.349493 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-user-template-login\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.349529 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.349574 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " 
pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.349604 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.349636 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.349666 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-user-template-error\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.349713 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.349798 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.349831 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.349905 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.349928 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.349949 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.349969 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:57 crc 
kubenswrapper[4740]: I1009 10:31:57.350028 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqxth\" (UniqueName: \"kubernetes.io/projected/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-kube-api-access-sqxth\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.350051 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.350070 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.350088 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.350110 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.350129 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.350148 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.350882 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-audit-policies\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.350916 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.350921 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-audit-dir\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.351439 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.352082 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.353268 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.353322 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.353482 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.353775 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " 
pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.354058 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-user-template-login\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.355536 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-user-template-error\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.355801 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.357501 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-v4-0-config-system-session\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.365771 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7m9b\" (UniqueName: 
\"kubernetes.io/projected/6da19a06-1cc5-48d0-a5b1-2ce429bc1d43-kube-api-access-w7m9b\") pod \"oauth-openshift-6b788bb46c-qjpgn\" (UID: \"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43\") " pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.457253 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.645124 4740 generic.go:334] "Generic (PLEG): container finished" podID="b2d12f51-5b3d-4d6f-899f-af629cc0d4ad" containerID="6970bbe6018e4ea1e2edd85148d5bdd90d12742526de084ef529dd869f9d6054" exitCode=0 Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.645236 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.645270 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" event={"ID":"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad","Type":"ContainerDied","Data":"6970bbe6018e4ea1e2edd85148d5bdd90d12742526de084ef529dd869f9d6054"} Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.645631 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6zqw2" event={"ID":"b2d12f51-5b3d-4d6f-899f-af629cc0d4ad","Type":"ContainerDied","Data":"9638a5deb24f05fa2722fbe0c62d1dd065fec6d3f201967fc8bf9963ae2befd9"} Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.645653 4740 scope.go:117] "RemoveContainer" containerID="6970bbe6018e4ea1e2edd85148d5bdd90d12742526de084ef529dd869f9d6054" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.645721 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6b788bb46c-qjpgn"] Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.666435 4740 
scope.go:117] "RemoveContainer" containerID="6970bbe6018e4ea1e2edd85148d5bdd90d12742526de084ef529dd869f9d6054" Oct 09 10:31:57 crc kubenswrapper[4740]: E1009 10:31:57.666835 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6970bbe6018e4ea1e2edd85148d5bdd90d12742526de084ef529dd869f9d6054\": container with ID starting with 6970bbe6018e4ea1e2edd85148d5bdd90d12742526de084ef529dd869f9d6054 not found: ID does not exist" containerID="6970bbe6018e4ea1e2edd85148d5bdd90d12742526de084ef529dd869f9d6054" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.666863 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6970bbe6018e4ea1e2edd85148d5bdd90d12742526de084ef529dd869f9d6054"} err="failed to get container status \"6970bbe6018e4ea1e2edd85148d5bdd90d12742526de084ef529dd869f9d6054\": rpc error: code = NotFound desc = could not find container \"6970bbe6018e4ea1e2edd85148d5bdd90d12742526de084ef529dd869f9d6054\": container with ID starting with 6970bbe6018e4ea1e2edd85148d5bdd90d12742526de084ef529dd869f9d6054 not found: ID does not exist" Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.676246 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6zqw2"] Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.678914 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6zqw2"] Oct 09 10:31:57 crc kubenswrapper[4740]: I1009 10:31:57.771887 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2d12f51-5b3d-4d6f-899f-af629cc0d4ad" path="/var/lib/kubelet/pods/b2d12f51-5b3d-4d6f-899f-af629cc0d4ad/volumes" Oct 09 10:31:58 crc kubenswrapper[4740]: I1009 10:31:58.653173 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" 
event={"ID":"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43","Type":"ContainerStarted","Data":"6d15e6d05751df3dd07ee25bdf4debc7719772a359104a223e9d33415b82d237"} Oct 09 10:31:58 crc kubenswrapper[4740]: I1009 10:31:58.653233 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" event={"ID":"6da19a06-1cc5-48d0-a5b1-2ce429bc1d43","Type":"ContainerStarted","Data":"4027e339b9cd9f1505b5ff190927f1c3dddb0f727bfff308287dc09e14a0c40b"} Oct 09 10:31:58 crc kubenswrapper[4740]: I1009 10:31:58.653592 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:58 crc kubenswrapper[4740]: I1009 10:31:58.659613 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" Oct 09 10:31:58 crc kubenswrapper[4740]: I1009 10:31:58.685978 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6b788bb46c-qjpgn" podStartSLOduration=27.68595111 podStartE2EDuration="27.68595111s" podCreationTimestamp="2025-10-09 10:31:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:31:58.682979 +0000 UTC m=+257.645179381" watchObservedRunningTime="2025-10-09 10:31:58.68595111 +0000 UTC m=+257.648151531" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.245608 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jckll"] Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.246896 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jckll" podUID="e86840e3-2c55-417d-9fa9-6eccaa01ad1a" containerName="registry-server" containerID="cri-o://be45ce4ea72aff19cca6f8a8001dec6f8c5a001834a2e413b65c9ca29d6c784a" 
gracePeriod=30 Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.302249 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lrbjh"] Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.302538 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lrbjh" podUID="b3267f79-181b-4b3e-b0c6-eba2901bf0cc" containerName="registry-server" containerID="cri-o://23f0b623e9ea42cab5f7e4b756e0107bb34f00c077d59f84cb3d49830cab4af7" gracePeriod=30 Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.315226 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zxkqd"] Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.315859 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd" podUID="07b58f91-881e-4c94-96b6-ff6126e39824" containerName="marketplace-operator" containerID="cri-o://c3dc7acc897d77cafe46c80d73775f6c0613d79766875380b5aefb2dff390eb7" gracePeriod=30 Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.328883 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l22jb"] Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.328942 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m5xjh"] Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.329175 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m5xjh" podUID="c009e3bf-3859-4e55-95c0-dc8049291674" containerName="registry-server" containerID="cri-o://2ae301d42df4039b2b39d02d335bc34265890c31fda2170845ab746fbf95d622" gracePeriod=30 Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.329399 4740 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-l22jb" podUID="13877acf-3046-4702-983c-5a3fc856477c" containerName="registry-server" containerID="cri-o://c9f5b4de7cb04660eef22387d92a81341dc8ce08b9b63b2ce977850c115f51e8" gracePeriod=30 Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.333059 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zlsxx"] Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.340482 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zlsxx" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.344545 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zlsxx"] Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.471733 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m45x\" (UniqueName: \"kubernetes.io/projected/26400837-285d-412b-944c-5b1fcb42b34f-kube-api-access-6m45x\") pod \"marketplace-operator-79b997595-zlsxx\" (UID: \"26400837-285d-412b-944c-5b1fcb42b34f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zlsxx" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.471815 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/26400837-285d-412b-944c-5b1fcb42b34f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zlsxx\" (UID: \"26400837-285d-412b-944c-5b1fcb42b34f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zlsxx" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.471842 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/26400837-285d-412b-944c-5b1fcb42b34f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zlsxx\" (UID: \"26400837-285d-412b-944c-5b1fcb42b34f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zlsxx" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.572645 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m45x\" (UniqueName: \"kubernetes.io/projected/26400837-285d-412b-944c-5b1fcb42b34f-kube-api-access-6m45x\") pod \"marketplace-operator-79b997595-zlsxx\" (UID: \"26400837-285d-412b-944c-5b1fcb42b34f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zlsxx" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.572700 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/26400837-285d-412b-944c-5b1fcb42b34f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zlsxx\" (UID: \"26400837-285d-412b-944c-5b1fcb42b34f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zlsxx" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.572728 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26400837-285d-412b-944c-5b1fcb42b34f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zlsxx\" (UID: \"26400837-285d-412b-944c-5b1fcb42b34f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zlsxx" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.574216 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26400837-285d-412b-944c-5b1fcb42b34f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zlsxx\" (UID: \"26400837-285d-412b-944c-5b1fcb42b34f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zlsxx" Oct 09 10:32:06 crc 
kubenswrapper[4740]: I1009 10:32:06.579521 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/26400837-285d-412b-944c-5b1fcb42b34f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zlsxx\" (UID: \"26400837-285d-412b-944c-5b1fcb42b34f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zlsxx" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.588243 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m45x\" (UniqueName: \"kubernetes.io/projected/26400837-285d-412b-944c-5b1fcb42b34f-kube-api-access-6m45x\") pod \"marketplace-operator-79b997595-zlsxx\" (UID: \"26400837-285d-412b-944c-5b1fcb42b34f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zlsxx" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.705878 4740 generic.go:334] "Generic (PLEG): container finished" podID="13877acf-3046-4702-983c-5a3fc856477c" containerID="c9f5b4de7cb04660eef22387d92a81341dc8ce08b9b63b2ce977850c115f51e8" exitCode=0 Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.705967 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l22jb" event={"ID":"13877acf-3046-4702-983c-5a3fc856477c","Type":"ContainerDied","Data":"c9f5b4de7cb04660eef22387d92a81341dc8ce08b9b63b2ce977850c115f51e8"} Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.706898 4740 generic.go:334] "Generic (PLEG): container finished" podID="07b58f91-881e-4c94-96b6-ff6126e39824" containerID="c3dc7acc897d77cafe46c80d73775f6c0613d79766875380b5aefb2dff390eb7" exitCode=0 Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.706936 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd" 
event={"ID":"07b58f91-881e-4c94-96b6-ff6126e39824","Type":"ContainerDied","Data":"c3dc7acc897d77cafe46c80d73775f6c0613d79766875380b5aefb2dff390eb7"} Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.708410 4740 generic.go:334] "Generic (PLEG): container finished" podID="e86840e3-2c55-417d-9fa9-6eccaa01ad1a" containerID="be45ce4ea72aff19cca6f8a8001dec6f8c5a001834a2e413b65c9ca29d6c784a" exitCode=0 Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.708450 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jckll" event={"ID":"e86840e3-2c55-417d-9fa9-6eccaa01ad1a","Type":"ContainerDied","Data":"be45ce4ea72aff19cca6f8a8001dec6f8c5a001834a2e413b65c9ca29d6c784a"} Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.708466 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jckll" event={"ID":"e86840e3-2c55-417d-9fa9-6eccaa01ad1a","Type":"ContainerDied","Data":"191158acb51c6bb5ad54881d0da3f1126fd8a9ac6323a725fd243b11e945bb8e"} Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.708478 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="191158acb51c6bb5ad54881d0da3f1126fd8a9ac6323a725fd243b11e945bb8e" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.710204 4740 generic.go:334] "Generic (PLEG): container finished" podID="c009e3bf-3859-4e55-95c0-dc8049291674" containerID="2ae301d42df4039b2b39d02d335bc34265890c31fda2170845ab746fbf95d622" exitCode=0 Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.710240 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5xjh" event={"ID":"c009e3bf-3859-4e55-95c0-dc8049291674","Type":"ContainerDied","Data":"2ae301d42df4039b2b39d02d335bc34265890c31fda2170845ab746fbf95d622"} Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.711687 4740 generic.go:334] "Generic (PLEG): container finished" 
podID="b3267f79-181b-4b3e-b0c6-eba2901bf0cc" containerID="23f0b623e9ea42cab5f7e4b756e0107bb34f00c077d59f84cb3d49830cab4af7" exitCode=0 Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.711724 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrbjh" event={"ID":"b3267f79-181b-4b3e-b0c6-eba2901bf0cc","Type":"ContainerDied","Data":"23f0b623e9ea42cab5f7e4b756e0107bb34f00c077d59f84cb3d49830cab4af7"} Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.754500 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zlsxx" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.758493 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jckll" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.778012 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.784570 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrbjh" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.786955 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l22jb" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.833283 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m5xjh" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.885583 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07b58f91-881e-4c94-96b6-ff6126e39824-marketplace-trusted-ca\") pod \"07b58f91-881e-4c94-96b6-ff6126e39824\" (UID: \"07b58f91-881e-4c94-96b6-ff6126e39824\") " Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.885620 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3267f79-181b-4b3e-b0c6-eba2901bf0cc-catalog-content\") pod \"b3267f79-181b-4b3e-b0c6-eba2901bf0cc\" (UID: \"b3267f79-181b-4b3e-b0c6-eba2901bf0cc\") " Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.885637 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13877acf-3046-4702-983c-5a3fc856477c-catalog-content\") pod \"13877acf-3046-4702-983c-5a3fc856477c\" (UID: \"13877acf-3046-4702-983c-5a3fc856477c\") " Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.885671 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzgkt\" (UniqueName: \"kubernetes.io/projected/e86840e3-2c55-417d-9fa9-6eccaa01ad1a-kube-api-access-vzgkt\") pod \"e86840e3-2c55-417d-9fa9-6eccaa01ad1a\" (UID: \"e86840e3-2c55-417d-9fa9-6eccaa01ad1a\") " Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.885688 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd77s\" (UniqueName: \"kubernetes.io/projected/c009e3bf-3859-4e55-95c0-dc8049291674-kube-api-access-wd77s\") pod \"c009e3bf-3859-4e55-95c0-dc8049291674\" (UID: \"c009e3bf-3859-4e55-95c0-dc8049291674\") " Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.885713 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c009e3bf-3859-4e55-95c0-dc8049291674-utilities\") pod \"c009e3bf-3859-4e55-95c0-dc8049291674\" (UID: \"c009e3bf-3859-4e55-95c0-dc8049291674\") " Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.885742 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3267f79-181b-4b3e-b0c6-eba2901bf0cc-utilities\") pod \"b3267f79-181b-4b3e-b0c6-eba2901bf0cc\" (UID: \"b3267f79-181b-4b3e-b0c6-eba2901bf0cc\") " Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.885774 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/07b58f91-881e-4c94-96b6-ff6126e39824-marketplace-operator-metrics\") pod \"07b58f91-881e-4c94-96b6-ff6126e39824\" (UID: \"07b58f91-881e-4c94-96b6-ff6126e39824\") " Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.885789 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86840e3-2c55-417d-9fa9-6eccaa01ad1a-utilities\") pod \"e86840e3-2c55-417d-9fa9-6eccaa01ad1a\" (UID: \"e86840e3-2c55-417d-9fa9-6eccaa01ad1a\") " Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.885825 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c009e3bf-3859-4e55-95c0-dc8049291674-catalog-content\") pod \"c009e3bf-3859-4e55-95c0-dc8049291674\" (UID: \"c009e3bf-3859-4e55-95c0-dc8049291674\") " Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.885849 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnlgh\" (UniqueName: \"kubernetes.io/projected/b3267f79-181b-4b3e-b0c6-eba2901bf0cc-kube-api-access-vnlgh\") pod \"b3267f79-181b-4b3e-b0c6-eba2901bf0cc\" (UID: 
\"b3267f79-181b-4b3e-b0c6-eba2901bf0cc\") " Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.885866 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54zft\" (UniqueName: \"kubernetes.io/projected/07b58f91-881e-4c94-96b6-ff6126e39824-kube-api-access-54zft\") pod \"07b58f91-881e-4c94-96b6-ff6126e39824\" (UID: \"07b58f91-881e-4c94-96b6-ff6126e39824\") " Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.885883 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf9rd\" (UniqueName: \"kubernetes.io/projected/13877acf-3046-4702-983c-5a3fc856477c-kube-api-access-jf9rd\") pod \"13877acf-3046-4702-983c-5a3fc856477c\" (UID: \"13877acf-3046-4702-983c-5a3fc856477c\") " Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.885901 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86840e3-2c55-417d-9fa9-6eccaa01ad1a-catalog-content\") pod \"e86840e3-2c55-417d-9fa9-6eccaa01ad1a\" (UID: \"e86840e3-2c55-417d-9fa9-6eccaa01ad1a\") " Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.885919 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13877acf-3046-4702-983c-5a3fc856477c-utilities\") pod \"13877acf-3046-4702-983c-5a3fc856477c\" (UID: \"13877acf-3046-4702-983c-5a3fc856477c\") " Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.887378 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e86840e3-2c55-417d-9fa9-6eccaa01ad1a-utilities" (OuterVolumeSpecName: "utilities") pod "e86840e3-2c55-417d-9fa9-6eccaa01ad1a" (UID: "e86840e3-2c55-417d-9fa9-6eccaa01ad1a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.887630 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07b58f91-881e-4c94-96b6-ff6126e39824-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "07b58f91-881e-4c94-96b6-ff6126e39824" (UID: "07b58f91-881e-4c94-96b6-ff6126e39824"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.888066 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c009e3bf-3859-4e55-95c0-dc8049291674-utilities" (OuterVolumeSpecName: "utilities") pod "c009e3bf-3859-4e55-95c0-dc8049291674" (UID: "c009e3bf-3859-4e55-95c0-dc8049291674"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.888254 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13877acf-3046-4702-983c-5a3fc856477c-utilities" (OuterVolumeSpecName: "utilities") pod "13877acf-3046-4702-983c-5a3fc856477c" (UID: "13877acf-3046-4702-983c-5a3fc856477c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.889205 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3267f79-181b-4b3e-b0c6-eba2901bf0cc-utilities" (OuterVolumeSpecName: "utilities") pod "b3267f79-181b-4b3e-b0c6-eba2901bf0cc" (UID: "b3267f79-181b-4b3e-b0c6-eba2901bf0cc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.893273 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e86840e3-2c55-417d-9fa9-6eccaa01ad1a-kube-api-access-vzgkt" (OuterVolumeSpecName: "kube-api-access-vzgkt") pod "e86840e3-2c55-417d-9fa9-6eccaa01ad1a" (UID: "e86840e3-2c55-417d-9fa9-6eccaa01ad1a"). InnerVolumeSpecName "kube-api-access-vzgkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.893409 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c009e3bf-3859-4e55-95c0-dc8049291674-kube-api-access-wd77s" (OuterVolumeSpecName: "kube-api-access-wd77s") pod "c009e3bf-3859-4e55-95c0-dc8049291674" (UID: "c009e3bf-3859-4e55-95c0-dc8049291674"). InnerVolumeSpecName "kube-api-access-wd77s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.893572 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3267f79-181b-4b3e-b0c6-eba2901bf0cc-kube-api-access-vnlgh" (OuterVolumeSpecName: "kube-api-access-vnlgh") pod "b3267f79-181b-4b3e-b0c6-eba2901bf0cc" (UID: "b3267f79-181b-4b3e-b0c6-eba2901bf0cc"). InnerVolumeSpecName "kube-api-access-vnlgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.894048 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13877acf-3046-4702-983c-5a3fc856477c-kube-api-access-jf9rd" (OuterVolumeSpecName: "kube-api-access-jf9rd") pod "13877acf-3046-4702-983c-5a3fc856477c" (UID: "13877acf-3046-4702-983c-5a3fc856477c"). InnerVolumeSpecName "kube-api-access-jf9rd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.895462 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07b58f91-881e-4c94-96b6-ff6126e39824-kube-api-access-54zft" (OuterVolumeSpecName: "kube-api-access-54zft") pod "07b58f91-881e-4c94-96b6-ff6126e39824" (UID: "07b58f91-881e-4c94-96b6-ff6126e39824"). InnerVolumeSpecName "kube-api-access-54zft". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.898175 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07b58f91-881e-4c94-96b6-ff6126e39824-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "07b58f91-881e-4c94-96b6-ff6126e39824" (UID: "07b58f91-881e-4c94-96b6-ff6126e39824"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.906931 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13877acf-3046-4702-983c-5a3fc856477c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13877acf-3046-4702-983c-5a3fc856477c" (UID: "13877acf-3046-4702-983c-5a3fc856477c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.947731 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e86840e3-2c55-417d-9fa9-6eccaa01ad1a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e86840e3-2c55-417d-9fa9-6eccaa01ad1a" (UID: "e86840e3-2c55-417d-9fa9-6eccaa01ad1a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.952337 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3267f79-181b-4b3e-b0c6-eba2901bf0cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3267f79-181b-4b3e-b0c6-eba2901bf0cc" (UID: "b3267f79-181b-4b3e-b0c6-eba2901bf0cc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.987356 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c009e3bf-3859-4e55-95c0-dc8049291674-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.987390 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3267f79-181b-4b3e-b0c6-eba2901bf0cc-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.987400 4740 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/07b58f91-881e-4c94-96b6-ff6126e39824-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.987409 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86840e3-2c55-417d-9fa9-6eccaa01ad1a-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.987419 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnlgh\" (UniqueName: \"kubernetes.io/projected/b3267f79-181b-4b3e-b0c6-eba2901bf0cc-kube-api-access-vnlgh\") on node \"crc\" DevicePath \"\"" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.987428 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54zft\" 
(UniqueName: \"kubernetes.io/projected/07b58f91-881e-4c94-96b6-ff6126e39824-kube-api-access-54zft\") on node \"crc\" DevicePath \"\"" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.987438 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf9rd\" (UniqueName: \"kubernetes.io/projected/13877acf-3046-4702-983c-5a3fc856477c-kube-api-access-jf9rd\") on node \"crc\" DevicePath \"\"" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.987445 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86840e3-2c55-417d-9fa9-6eccaa01ad1a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.987453 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13877acf-3046-4702-983c-5a3fc856477c-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.987460 4740 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07b58f91-881e-4c94-96b6-ff6126e39824-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.987468 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3267f79-181b-4b3e-b0c6-eba2901bf0cc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.987476 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13877acf-3046-4702-983c-5a3fc856477c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.987484 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzgkt\" (UniqueName: 
\"kubernetes.io/projected/e86840e3-2c55-417d-9fa9-6eccaa01ad1a-kube-api-access-vzgkt\") on node \"crc\" DevicePath \"\"" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.987493 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd77s\" (UniqueName: \"kubernetes.io/projected/c009e3bf-3859-4e55-95c0-dc8049291674-kube-api-access-wd77s\") on node \"crc\" DevicePath \"\"" Oct 09 10:32:06 crc kubenswrapper[4740]: I1009 10:32:06.989963 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c009e3bf-3859-4e55-95c0-dc8049291674-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c009e3bf-3859-4e55-95c0-dc8049291674" (UID: "c009e3bf-3859-4e55-95c0-dc8049291674"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.088189 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c009e3bf-3859-4e55-95c0-dc8049291674-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.177370 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zlsxx"] Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.720440 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l22jb" event={"ID":"13877acf-3046-4702-983c-5a3fc856477c","Type":"ContainerDied","Data":"ce1d0bcff60d952bea319fa9a9efb5f279f758a83b4ef34c7ce9f0e1854a518c"} Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.720504 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l22jb" Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.720541 4740 scope.go:117] "RemoveContainer" containerID="c9f5b4de7cb04660eef22387d92a81341dc8ce08b9b63b2ce977850c115f51e8" Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.723372 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m5xjh" Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.724614 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5xjh" event={"ID":"c009e3bf-3859-4e55-95c0-dc8049291674","Type":"ContainerDied","Data":"e0d2ffdc3ca622ead0d97772a82452486e20812efedeada83868edaa2f086572"} Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.728730 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrbjh" event={"ID":"b3267f79-181b-4b3e-b0c6-eba2901bf0cc","Type":"ContainerDied","Data":"860c0f4878fdd4a94ddbb3b3cc062cb967163803b08ebba5f98f74eca2e47c9a"} Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.728865 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lrbjh" Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.731012 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zlsxx" event={"ID":"26400837-285d-412b-944c-5b1fcb42b34f","Type":"ContainerStarted","Data":"64568289dc994d119b526054fa804278be9d0fd4af4f073ab2ca442e8dcb2b88"} Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.731059 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zlsxx" event={"ID":"26400837-285d-412b-944c-5b1fcb42b34f","Type":"ContainerStarted","Data":"22311943ce4b1c114805376e4ea3a3dbd14d804dd6a1a9e747d0969bc2db351d"} Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.731279 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zlsxx" Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.732527 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jckll" Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.732874 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd" event={"ID":"07b58f91-881e-4c94-96b6-ff6126e39824","Type":"ContainerDied","Data":"045b5e79b53f8c78bdca56f4d17c503ab7b5327861294556149ff41f7b3e6438"} Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.732897 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zxkqd" Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.739913 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zlsxx" Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.744358 4740 scope.go:117] "RemoveContainer" containerID="3c7616f12c693d606d48f235b082d4155aae6f099a1d4ca3ce5c3a22ad375c6d" Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.817811 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zlsxx" podStartSLOduration=1.817737373 podStartE2EDuration="1.817737373s" podCreationTimestamp="2025-10-09 10:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:32:07.755777241 +0000 UTC m=+266.717977622" watchObservedRunningTime="2025-10-09 10:32:07.817737373 +0000 UTC m=+266.779937754" Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.823874 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l22jb"] Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.826309 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l22jb"] Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.832559 4740 scope.go:117] "RemoveContainer" containerID="c47786ccd6b0f5ea879e44dc759021539a838f442b925c30ed3b5f01873d93da" Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.859637 4740 scope.go:117] "RemoveContainer" containerID="2ae301d42df4039b2b39d02d335bc34265890c31fda2170845ab746fbf95d622" Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.859741 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m5xjh"] Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.877740 4740 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m5xjh"] Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.885166 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jckll"] Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.887007 4740 scope.go:117] "RemoveContainer" containerID="ed77283d0d7834790b8aa54ab234f94261a771692b284177aeb0205ff3bf70fb" Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.892047 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jckll"] Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.896529 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zxkqd"] Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.899203 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zxkqd"] Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.902403 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lrbjh"] Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.904878 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lrbjh"] Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.913813 4740 scope.go:117] "RemoveContainer" containerID="b7baf7925be29982ae7dd84757590a33ddfdae7044c79a3386ab2c7e627978f1" Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.927568 4740 scope.go:117] "RemoveContainer" containerID="23f0b623e9ea42cab5f7e4b756e0107bb34f00c077d59f84cb3d49830cab4af7" Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.962980 4740 scope.go:117] "RemoveContainer" containerID="de032ebbb0a29e30b9558f11b0024b295856242043de1a730755a8cc3f52bae4" Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.980480 4740 scope.go:117] "RemoveContainer" 
containerID="a6b7ab4b98439f5d1fc0829847983ba8c38a56db73ec1b0b4a6e6f015f03b71f" Oct 09 10:32:07 crc kubenswrapper[4740]: I1009 10:32:07.998736 4740 scope.go:117] "RemoveContainer" containerID="c3dc7acc897d77cafe46c80d73775f6c0613d79766875380b5aefb2dff390eb7" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.567393 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4t5tf"] Oct 09 10:32:08 crc kubenswrapper[4740]: E1009 10:32:08.567630 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3267f79-181b-4b3e-b0c6-eba2901bf0cc" containerName="extract-utilities" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.567645 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3267f79-181b-4b3e-b0c6-eba2901bf0cc" containerName="extract-utilities" Oct 09 10:32:08 crc kubenswrapper[4740]: E1009 10:32:08.567656 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86840e3-2c55-417d-9fa9-6eccaa01ad1a" containerName="extract-utilities" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.567664 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86840e3-2c55-417d-9fa9-6eccaa01ad1a" containerName="extract-utilities" Oct 09 10:32:08 crc kubenswrapper[4740]: E1009 10:32:08.567677 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13877acf-3046-4702-983c-5a3fc856477c" containerName="extract-content" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.567687 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="13877acf-3046-4702-983c-5a3fc856477c" containerName="extract-content" Oct 09 10:32:08 crc kubenswrapper[4740]: E1009 10:32:08.567698 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13877acf-3046-4702-983c-5a3fc856477c" containerName="extract-utilities" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.567706 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="13877acf-3046-4702-983c-5a3fc856477c" 
containerName="extract-utilities" Oct 09 10:32:08 crc kubenswrapper[4740]: E1009 10:32:08.567720 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86840e3-2c55-417d-9fa9-6eccaa01ad1a" containerName="registry-server" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.567728 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86840e3-2c55-417d-9fa9-6eccaa01ad1a" containerName="registry-server" Oct 09 10:32:08 crc kubenswrapper[4740]: E1009 10:32:08.567742 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c009e3bf-3859-4e55-95c0-dc8049291674" containerName="extract-utilities" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.567774 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c009e3bf-3859-4e55-95c0-dc8049291674" containerName="extract-utilities" Oct 09 10:32:08 crc kubenswrapper[4740]: E1009 10:32:08.567795 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07b58f91-881e-4c94-96b6-ff6126e39824" containerName="marketplace-operator" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.567806 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="07b58f91-881e-4c94-96b6-ff6126e39824" containerName="marketplace-operator" Oct 09 10:32:08 crc kubenswrapper[4740]: E1009 10:32:08.567819 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86840e3-2c55-417d-9fa9-6eccaa01ad1a" containerName="extract-content" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.567830 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86840e3-2c55-417d-9fa9-6eccaa01ad1a" containerName="extract-content" Oct 09 10:32:08 crc kubenswrapper[4740]: E1009 10:32:08.567842 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13877acf-3046-4702-983c-5a3fc856477c" containerName="registry-server" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.567849 4740 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="13877acf-3046-4702-983c-5a3fc856477c" containerName="registry-server" Oct 09 10:32:08 crc kubenswrapper[4740]: E1009 10:32:08.567860 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3267f79-181b-4b3e-b0c6-eba2901bf0cc" containerName="registry-server" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.567868 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3267f79-181b-4b3e-b0c6-eba2901bf0cc" containerName="registry-server" Oct 09 10:32:08 crc kubenswrapper[4740]: E1009 10:32:08.567878 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c009e3bf-3859-4e55-95c0-dc8049291674" containerName="extract-content" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.567886 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c009e3bf-3859-4e55-95c0-dc8049291674" containerName="extract-content" Oct 09 10:32:08 crc kubenswrapper[4740]: E1009 10:32:08.567896 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c009e3bf-3859-4e55-95c0-dc8049291674" containerName="registry-server" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.567907 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c009e3bf-3859-4e55-95c0-dc8049291674" containerName="registry-server" Oct 09 10:32:08 crc kubenswrapper[4740]: E1009 10:32:08.567920 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3267f79-181b-4b3e-b0c6-eba2901bf0cc" containerName="extract-content" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.567930 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3267f79-181b-4b3e-b0c6-eba2901bf0cc" containerName="extract-content" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.568064 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="07b58f91-881e-4c94-96b6-ff6126e39824" containerName="marketplace-operator" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.568089 4740 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="13877acf-3046-4702-983c-5a3fc856477c" containerName="registry-server" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.568103 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="e86840e3-2c55-417d-9fa9-6eccaa01ad1a" containerName="registry-server" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.568120 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3267f79-181b-4b3e-b0c6-eba2901bf0cc" containerName="registry-server" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.568137 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="c009e3bf-3859-4e55-95c0-dc8049291674" containerName="registry-server" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.569116 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4t5tf" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.572005 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.572255 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4t5tf"] Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.722539 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/726d1cc8-a024-4ec8-a89c-bb7018c6b82f-utilities\") pod \"certified-operators-4t5tf\" (UID: \"726d1cc8-a024-4ec8-a89c-bb7018c6b82f\") " pod="openshift-marketplace/certified-operators-4t5tf" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.722607 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/726d1cc8-a024-4ec8-a89c-bb7018c6b82f-catalog-content\") pod \"certified-operators-4t5tf\" (UID: 
\"726d1cc8-a024-4ec8-a89c-bb7018c6b82f\") " pod="openshift-marketplace/certified-operators-4t5tf" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.722674 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq8j5\" (UniqueName: \"kubernetes.io/projected/726d1cc8-a024-4ec8-a89c-bb7018c6b82f-kube-api-access-sq8j5\") pod \"certified-operators-4t5tf\" (UID: \"726d1cc8-a024-4ec8-a89c-bb7018c6b82f\") " pod="openshift-marketplace/certified-operators-4t5tf" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.764247 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cnjss"] Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.766028 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cnjss" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.769997 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.776325 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cnjss"] Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.824063 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64ce5306-aa76-4f28-bb81-4a37bbb283e8-catalog-content\") pod \"redhat-marketplace-cnjss\" (UID: \"64ce5306-aa76-4f28-bb81-4a37bbb283e8\") " pod="openshift-marketplace/redhat-marketplace-cnjss" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.824160 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5fv7\" (UniqueName: \"kubernetes.io/projected/64ce5306-aa76-4f28-bb81-4a37bbb283e8-kube-api-access-d5fv7\") pod \"redhat-marketplace-cnjss\" (UID: 
\"64ce5306-aa76-4f28-bb81-4a37bbb283e8\") " pod="openshift-marketplace/redhat-marketplace-cnjss" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.824199 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/726d1cc8-a024-4ec8-a89c-bb7018c6b82f-utilities\") pod \"certified-operators-4t5tf\" (UID: \"726d1cc8-a024-4ec8-a89c-bb7018c6b82f\") " pod="openshift-marketplace/certified-operators-4t5tf" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.824220 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/726d1cc8-a024-4ec8-a89c-bb7018c6b82f-catalog-content\") pod \"certified-operators-4t5tf\" (UID: \"726d1cc8-a024-4ec8-a89c-bb7018c6b82f\") " pod="openshift-marketplace/certified-operators-4t5tf" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.824258 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64ce5306-aa76-4f28-bb81-4a37bbb283e8-utilities\") pod \"redhat-marketplace-cnjss\" (UID: \"64ce5306-aa76-4f28-bb81-4a37bbb283e8\") " pod="openshift-marketplace/redhat-marketplace-cnjss" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.824293 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq8j5\" (UniqueName: \"kubernetes.io/projected/726d1cc8-a024-4ec8-a89c-bb7018c6b82f-kube-api-access-sq8j5\") pod \"certified-operators-4t5tf\" (UID: \"726d1cc8-a024-4ec8-a89c-bb7018c6b82f\") " pod="openshift-marketplace/certified-operators-4t5tf" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.825645 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/726d1cc8-a024-4ec8-a89c-bb7018c6b82f-utilities\") pod \"certified-operators-4t5tf\" (UID: 
\"726d1cc8-a024-4ec8-a89c-bb7018c6b82f\") " pod="openshift-marketplace/certified-operators-4t5tf" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.825908 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/726d1cc8-a024-4ec8-a89c-bb7018c6b82f-catalog-content\") pod \"certified-operators-4t5tf\" (UID: \"726d1cc8-a024-4ec8-a89c-bb7018c6b82f\") " pod="openshift-marketplace/certified-operators-4t5tf" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.844123 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq8j5\" (UniqueName: \"kubernetes.io/projected/726d1cc8-a024-4ec8-a89c-bb7018c6b82f-kube-api-access-sq8j5\") pod \"certified-operators-4t5tf\" (UID: \"726d1cc8-a024-4ec8-a89c-bb7018c6b82f\") " pod="openshift-marketplace/certified-operators-4t5tf" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.886770 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4t5tf" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.925320 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64ce5306-aa76-4f28-bb81-4a37bbb283e8-catalog-content\") pod \"redhat-marketplace-cnjss\" (UID: \"64ce5306-aa76-4f28-bb81-4a37bbb283e8\") " pod="openshift-marketplace/redhat-marketplace-cnjss" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.925425 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5fv7\" (UniqueName: \"kubernetes.io/projected/64ce5306-aa76-4f28-bb81-4a37bbb283e8-kube-api-access-d5fv7\") pod \"redhat-marketplace-cnjss\" (UID: \"64ce5306-aa76-4f28-bb81-4a37bbb283e8\") " pod="openshift-marketplace/redhat-marketplace-cnjss" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.925487 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64ce5306-aa76-4f28-bb81-4a37bbb283e8-utilities\") pod \"redhat-marketplace-cnjss\" (UID: \"64ce5306-aa76-4f28-bb81-4a37bbb283e8\") " pod="openshift-marketplace/redhat-marketplace-cnjss" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.926087 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64ce5306-aa76-4f28-bb81-4a37bbb283e8-catalog-content\") pod \"redhat-marketplace-cnjss\" (UID: \"64ce5306-aa76-4f28-bb81-4a37bbb283e8\") " pod="openshift-marketplace/redhat-marketplace-cnjss" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.926268 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64ce5306-aa76-4f28-bb81-4a37bbb283e8-utilities\") pod \"redhat-marketplace-cnjss\" (UID: \"64ce5306-aa76-4f28-bb81-4a37bbb283e8\") " 
pod="openshift-marketplace/redhat-marketplace-cnjss" Oct 09 10:32:08 crc kubenswrapper[4740]: I1009 10:32:08.945976 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5fv7\" (UniqueName: \"kubernetes.io/projected/64ce5306-aa76-4f28-bb81-4a37bbb283e8-kube-api-access-d5fv7\") pod \"redhat-marketplace-cnjss\" (UID: \"64ce5306-aa76-4f28-bb81-4a37bbb283e8\") " pod="openshift-marketplace/redhat-marketplace-cnjss" Oct 09 10:32:09 crc kubenswrapper[4740]: I1009 10:32:09.094034 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cnjss" Oct 09 10:32:09 crc kubenswrapper[4740]: I1009 10:32:09.262086 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cnjss"] Oct 09 10:32:09 crc kubenswrapper[4740]: W1009 10:32:09.268705 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64ce5306_aa76_4f28_bb81_4a37bbb283e8.slice/crio-f049ff4fd8c0a15f53ab3611cce3c4273edb88a4103e22623b45fcec7bfff043 WatchSource:0}: Error finding container f049ff4fd8c0a15f53ab3611cce3c4273edb88a4103e22623b45fcec7bfff043: Status 404 returned error can't find the container with id f049ff4fd8c0a15f53ab3611cce3c4273edb88a4103e22623b45fcec7bfff043 Oct 09 10:32:09 crc kubenswrapper[4740]: I1009 10:32:09.303314 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4t5tf"] Oct 09 10:32:09 crc kubenswrapper[4740]: I1009 10:32:09.758227 4740 generic.go:334] "Generic (PLEG): container finished" podID="726d1cc8-a024-4ec8-a89c-bb7018c6b82f" containerID="abc09b558e57be4ccedabd392fd19b3f81966596d978ab9a0be29576be0252a9" exitCode=0 Oct 09 10:32:09 crc kubenswrapper[4740]: I1009 10:32:09.771498 4740 generic.go:334] "Generic (PLEG): container finished" podID="64ce5306-aa76-4f28-bb81-4a37bbb283e8" 
containerID="7dd52dc7b7012a229b5bfa5cb2582b8786b7c77e56d3593b71f8339090e832e9" exitCode=0 Oct 09 10:32:09 crc kubenswrapper[4740]: I1009 10:32:09.779744 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07b58f91-881e-4c94-96b6-ff6126e39824" path="/var/lib/kubelet/pods/07b58f91-881e-4c94-96b6-ff6126e39824/volumes" Oct 09 10:32:09 crc kubenswrapper[4740]: I1009 10:32:09.780397 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13877acf-3046-4702-983c-5a3fc856477c" path="/var/lib/kubelet/pods/13877acf-3046-4702-983c-5a3fc856477c/volumes" Oct 09 10:32:09 crc kubenswrapper[4740]: I1009 10:32:09.781220 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3267f79-181b-4b3e-b0c6-eba2901bf0cc" path="/var/lib/kubelet/pods/b3267f79-181b-4b3e-b0c6-eba2901bf0cc/volumes" Oct 09 10:32:09 crc kubenswrapper[4740]: I1009 10:32:09.782703 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c009e3bf-3859-4e55-95c0-dc8049291674" path="/var/lib/kubelet/pods/c009e3bf-3859-4e55-95c0-dc8049291674/volumes" Oct 09 10:32:09 crc kubenswrapper[4740]: I1009 10:32:09.783539 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e86840e3-2c55-417d-9fa9-6eccaa01ad1a" path="/var/lib/kubelet/pods/e86840e3-2c55-417d-9fa9-6eccaa01ad1a/volumes" Oct 09 10:32:09 crc kubenswrapper[4740]: I1009 10:32:09.784228 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t5tf" event={"ID":"726d1cc8-a024-4ec8-a89c-bb7018c6b82f","Type":"ContainerDied","Data":"abc09b558e57be4ccedabd392fd19b3f81966596d978ab9a0be29576be0252a9"} Oct 09 10:32:09 crc kubenswrapper[4740]: I1009 10:32:09.784259 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t5tf" event={"ID":"726d1cc8-a024-4ec8-a89c-bb7018c6b82f","Type":"ContainerStarted","Data":"0f56ab2a782597668b30200f1d53285b7fca5d924cef6ccb726798b617c0fdb5"} Oct 09 
10:32:09 crc kubenswrapper[4740]: I1009 10:32:09.784275 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cnjss" event={"ID":"64ce5306-aa76-4f28-bb81-4a37bbb283e8","Type":"ContainerDied","Data":"7dd52dc7b7012a229b5bfa5cb2582b8786b7c77e56d3593b71f8339090e832e9"} Oct 09 10:32:09 crc kubenswrapper[4740]: I1009 10:32:09.784290 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cnjss" event={"ID":"64ce5306-aa76-4f28-bb81-4a37bbb283e8","Type":"ContainerStarted","Data":"f049ff4fd8c0a15f53ab3611cce3c4273edb88a4103e22623b45fcec7bfff043"} Oct 09 10:32:10 crc kubenswrapper[4740]: I1009 10:32:10.778650 4740 generic.go:334] "Generic (PLEG): container finished" podID="64ce5306-aa76-4f28-bb81-4a37bbb283e8" containerID="4a7f4a49a5bf899a68601ee993b917808e51a2b24fee7ae198cb883f1ab40344" exitCode=0 Oct 09 10:32:10 crc kubenswrapper[4740]: I1009 10:32:10.778692 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cnjss" event={"ID":"64ce5306-aa76-4f28-bb81-4a37bbb283e8","Type":"ContainerDied","Data":"4a7f4a49a5bf899a68601ee993b917808e51a2b24fee7ae198cb883f1ab40344"} Oct 09 10:32:10 crc kubenswrapper[4740]: I1009 10:32:10.968605 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qktxn"] Oct 09 10:32:10 crc kubenswrapper[4740]: I1009 10:32:10.971282 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qktxn" Oct 09 10:32:10 crc kubenswrapper[4740]: I1009 10:32:10.973081 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 09 10:32:10 crc kubenswrapper[4740]: I1009 10:32:10.974741 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qktxn"] Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.064289 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-454q8\" (UniqueName: \"kubernetes.io/projected/310ad751-9417-4fb0-bdf2-d892a167b55f-kube-api-access-454q8\") pod \"redhat-operators-qktxn\" (UID: \"310ad751-9417-4fb0-bdf2-d892a167b55f\") " pod="openshift-marketplace/redhat-operators-qktxn" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.064729 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/310ad751-9417-4fb0-bdf2-d892a167b55f-catalog-content\") pod \"redhat-operators-qktxn\" (UID: \"310ad751-9417-4fb0-bdf2-d892a167b55f\") " pod="openshift-marketplace/redhat-operators-qktxn" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.064842 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/310ad751-9417-4fb0-bdf2-d892a167b55f-utilities\") pod \"redhat-operators-qktxn\" (UID: \"310ad751-9417-4fb0-bdf2-d892a167b55f\") " pod="openshift-marketplace/redhat-operators-qktxn" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.166353 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/310ad751-9417-4fb0-bdf2-d892a167b55f-utilities\") pod \"redhat-operators-qktxn\" (UID: \"310ad751-9417-4fb0-bdf2-d892a167b55f\") " 
pod="openshift-marketplace/redhat-operators-qktxn" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.166440 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-454q8\" (UniqueName: \"kubernetes.io/projected/310ad751-9417-4fb0-bdf2-d892a167b55f-kube-api-access-454q8\") pod \"redhat-operators-qktxn\" (UID: \"310ad751-9417-4fb0-bdf2-d892a167b55f\") " pod="openshift-marketplace/redhat-operators-qktxn" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.166459 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/310ad751-9417-4fb0-bdf2-d892a167b55f-catalog-content\") pod \"redhat-operators-qktxn\" (UID: \"310ad751-9417-4fb0-bdf2-d892a167b55f\") " pod="openshift-marketplace/redhat-operators-qktxn" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.166893 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/310ad751-9417-4fb0-bdf2-d892a167b55f-catalog-content\") pod \"redhat-operators-qktxn\" (UID: \"310ad751-9417-4fb0-bdf2-d892a167b55f\") " pod="openshift-marketplace/redhat-operators-qktxn" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.167104 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/310ad751-9417-4fb0-bdf2-d892a167b55f-utilities\") pod \"redhat-operators-qktxn\" (UID: \"310ad751-9417-4fb0-bdf2-d892a167b55f\") " pod="openshift-marketplace/redhat-operators-qktxn" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.177724 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p8hx5"] Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.179471 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p8hx5" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.181502 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.182274 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p8hx5"] Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.194241 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-454q8\" (UniqueName: \"kubernetes.io/projected/310ad751-9417-4fb0-bdf2-d892a167b55f-kube-api-access-454q8\") pod \"redhat-operators-qktxn\" (UID: \"310ad751-9417-4fb0-bdf2-d892a167b55f\") " pod="openshift-marketplace/redhat-operators-qktxn" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.267594 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb472a61-940e-48b4-be35-bec21b4eca3c-catalog-content\") pod \"community-operators-p8hx5\" (UID: \"eb472a61-940e-48b4-be35-bec21b4eca3c\") " pod="openshift-marketplace/community-operators-p8hx5" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.267686 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb472a61-940e-48b4-be35-bec21b4eca3c-utilities\") pod \"community-operators-p8hx5\" (UID: \"eb472a61-940e-48b4-be35-bec21b4eca3c\") " pod="openshift-marketplace/community-operators-p8hx5" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.267717 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pspz9\" (UniqueName: \"kubernetes.io/projected/eb472a61-940e-48b4-be35-bec21b4eca3c-kube-api-access-pspz9\") pod \"community-operators-p8hx5\" (UID: 
\"eb472a61-940e-48b4-be35-bec21b4eca3c\") " pod="openshift-marketplace/community-operators-p8hx5" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.318457 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qktxn" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.368826 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb472a61-940e-48b4-be35-bec21b4eca3c-utilities\") pod \"community-operators-p8hx5\" (UID: \"eb472a61-940e-48b4-be35-bec21b4eca3c\") " pod="openshift-marketplace/community-operators-p8hx5" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.368886 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pspz9\" (UniqueName: \"kubernetes.io/projected/eb472a61-940e-48b4-be35-bec21b4eca3c-kube-api-access-pspz9\") pod \"community-operators-p8hx5\" (UID: \"eb472a61-940e-48b4-be35-bec21b4eca3c\") " pod="openshift-marketplace/community-operators-p8hx5" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.368945 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb472a61-940e-48b4-be35-bec21b4eca3c-catalog-content\") pod \"community-operators-p8hx5\" (UID: \"eb472a61-940e-48b4-be35-bec21b4eca3c\") " pod="openshift-marketplace/community-operators-p8hx5" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.369274 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb472a61-940e-48b4-be35-bec21b4eca3c-utilities\") pod \"community-operators-p8hx5\" (UID: \"eb472a61-940e-48b4-be35-bec21b4eca3c\") " pod="openshift-marketplace/community-operators-p8hx5" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.369385 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb472a61-940e-48b4-be35-bec21b4eca3c-catalog-content\") pod \"community-operators-p8hx5\" (UID: \"eb472a61-940e-48b4-be35-bec21b4eca3c\") " pod="openshift-marketplace/community-operators-p8hx5" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.388555 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pspz9\" (UniqueName: \"kubernetes.io/projected/eb472a61-940e-48b4-be35-bec21b4eca3c-kube-api-access-pspz9\") pod \"community-operators-p8hx5\" (UID: \"eb472a61-940e-48b4-be35-bec21b4eca3c\") " pod="openshift-marketplace/community-operators-p8hx5" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.516271 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p8hx5" Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.696257 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p8hx5"] Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.748120 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qktxn"] Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.787787 4740 generic.go:334] "Generic (PLEG): container finished" podID="726d1cc8-a024-4ec8-a89c-bb7018c6b82f" containerID="70c61df985f2d051a2a3b1d4ac76854f7a2a3a6db1772a2f7781fc1241a52421" exitCode=0 Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.787870 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t5tf" event={"ID":"726d1cc8-a024-4ec8-a89c-bb7018c6b82f","Type":"ContainerDied","Data":"70c61df985f2d051a2a3b1d4ac76854f7a2a3a6db1772a2f7781fc1241a52421"} Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.791999 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cnjss" 
event={"ID":"64ce5306-aa76-4f28-bb81-4a37bbb283e8","Type":"ContainerStarted","Data":"caa19ba8a2a38f13ebafe7b975fde55906017fb87afdb25fb9cc89fad6c6c9bc"} Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.794527 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8hx5" event={"ID":"eb472a61-940e-48b4-be35-bec21b4eca3c","Type":"ContainerStarted","Data":"b9b73cf8692b6ba8d566079c637bf266ab5f44c883f602acd0244b4ae17414d4"} Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.795573 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qktxn" event={"ID":"310ad751-9417-4fb0-bdf2-d892a167b55f","Type":"ContainerStarted","Data":"06252188ed2973082c2a332972c3f257398887fc6a9b52ed5f77d5015f09345d"} Oct 09 10:32:11 crc kubenswrapper[4740]: I1009 10:32:11.822960 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cnjss" podStartSLOduration=2.276927219 podStartE2EDuration="3.822943315s" podCreationTimestamp="2025-10-09 10:32:08 +0000 UTC" firstStartedPulling="2025-10-09 10:32:09.781973628 +0000 UTC m=+268.744174009" lastFinishedPulling="2025-10-09 10:32:11.327989724 +0000 UTC m=+270.290190105" observedRunningTime="2025-10-09 10:32:11.821058079 +0000 UTC m=+270.783258460" watchObservedRunningTime="2025-10-09 10:32:11.822943315 +0000 UTC m=+270.785143696" Oct 09 10:32:12 crc kubenswrapper[4740]: I1009 10:32:12.808157 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t5tf" event={"ID":"726d1cc8-a024-4ec8-a89c-bb7018c6b82f","Type":"ContainerStarted","Data":"05463da2ea0d184b0d4824621f1f12a6e105d6172a57519f5bc2daab52512951"} Oct 09 10:32:12 crc kubenswrapper[4740]: I1009 10:32:12.810141 4740 generic.go:334] "Generic (PLEG): container finished" podID="eb472a61-940e-48b4-be35-bec21b4eca3c" containerID="22efb2e9c95b5eb3488b6f4dc5b2a9b8b769e2054ddd6e3be79018eff2f0e0ed" 
exitCode=0 Oct 09 10:32:12 crc kubenswrapper[4740]: I1009 10:32:12.810210 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8hx5" event={"ID":"eb472a61-940e-48b4-be35-bec21b4eca3c","Type":"ContainerDied","Data":"22efb2e9c95b5eb3488b6f4dc5b2a9b8b769e2054ddd6e3be79018eff2f0e0ed"} Oct 09 10:32:12 crc kubenswrapper[4740]: I1009 10:32:12.811475 4740 generic.go:334] "Generic (PLEG): container finished" podID="310ad751-9417-4fb0-bdf2-d892a167b55f" containerID="f1bfe1a44444a23c6dfbf1a3c134dcef2ccf58848c26210e630f96c84ba8ebb3" exitCode=0 Oct 09 10:32:12 crc kubenswrapper[4740]: I1009 10:32:12.812352 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qktxn" event={"ID":"310ad751-9417-4fb0-bdf2-d892a167b55f","Type":"ContainerDied","Data":"f1bfe1a44444a23c6dfbf1a3c134dcef2ccf58848c26210e630f96c84ba8ebb3"} Oct 09 10:32:12 crc kubenswrapper[4740]: I1009 10:32:12.834430 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4t5tf" podStartSLOduration=2.427258712 podStartE2EDuration="4.834413569s" podCreationTimestamp="2025-10-09 10:32:08 +0000 UTC" firstStartedPulling="2025-10-09 10:32:09.782150603 +0000 UTC m=+268.744350984" lastFinishedPulling="2025-10-09 10:32:12.18930546 +0000 UTC m=+271.151505841" observedRunningTime="2025-10-09 10:32:12.831982537 +0000 UTC m=+271.794182958" watchObservedRunningTime="2025-10-09 10:32:12.834413569 +0000 UTC m=+271.796613950" Oct 09 10:32:15 crc kubenswrapper[4740]: I1009 10:32:15.832799 4740 generic.go:334] "Generic (PLEG): container finished" podID="eb472a61-940e-48b4-be35-bec21b4eca3c" containerID="251dc0167a24a808dc38cecf6f5f57f710bc2356256b5eab9080a9777fbb28cf" exitCode=0 Oct 09 10:32:15 crc kubenswrapper[4740]: I1009 10:32:15.832941 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8hx5" 
event={"ID":"eb472a61-940e-48b4-be35-bec21b4eca3c","Type":"ContainerDied","Data":"251dc0167a24a808dc38cecf6f5f57f710bc2356256b5eab9080a9777fbb28cf"} Oct 09 10:32:15 crc kubenswrapper[4740]: I1009 10:32:15.835899 4740 generic.go:334] "Generic (PLEG): container finished" podID="310ad751-9417-4fb0-bdf2-d892a167b55f" containerID="2e8955707d1113badefe57736749b4308238f8b46646ca7b54db8569b8d6d40a" exitCode=0 Oct 09 10:32:15 crc kubenswrapper[4740]: I1009 10:32:15.835946 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qktxn" event={"ID":"310ad751-9417-4fb0-bdf2-d892a167b55f","Type":"ContainerDied","Data":"2e8955707d1113badefe57736749b4308238f8b46646ca7b54db8569b8d6d40a"} Oct 09 10:32:16 crc kubenswrapper[4740]: I1009 10:32:16.842648 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qktxn" event={"ID":"310ad751-9417-4fb0-bdf2-d892a167b55f","Type":"ContainerStarted","Data":"ac08a5625a7ce0985b74b317e7b98111531c67cb12ee7c997282c5c5d1159696"} Oct 09 10:32:16 crc kubenswrapper[4740]: I1009 10:32:16.845739 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8hx5" event={"ID":"eb472a61-940e-48b4-be35-bec21b4eca3c","Type":"ContainerStarted","Data":"366fb55c143872976102bce235b6d9fb8f35b16e604ed3c5ae69411ae119bcbc"} Oct 09 10:32:16 crc kubenswrapper[4740]: I1009 10:32:16.860859 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qktxn" podStartSLOduration=3.324507527 podStartE2EDuration="6.860839272s" podCreationTimestamp="2025-10-09 10:32:10 +0000 UTC" firstStartedPulling="2025-10-09 10:32:12.817869907 +0000 UTC m=+271.780070288" lastFinishedPulling="2025-10-09 10:32:16.354201662 +0000 UTC m=+275.316402033" observedRunningTime="2025-10-09 10:32:16.860125401 +0000 UTC m=+275.822325802" watchObservedRunningTime="2025-10-09 10:32:16.860839272 +0000 UTC m=+275.823039653" Oct 
09 10:32:16 crc kubenswrapper[4740]: I1009 10:32:16.878928 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p8hx5" podStartSLOduration=2.03532101 podStartE2EDuration="5.87890688s" podCreationTimestamp="2025-10-09 10:32:11 +0000 UTC" firstStartedPulling="2025-10-09 10:32:12.813186578 +0000 UTC m=+271.775386959" lastFinishedPulling="2025-10-09 10:32:16.656772448 +0000 UTC m=+275.618972829" observedRunningTime="2025-10-09 10:32:16.875634023 +0000 UTC m=+275.837834414" watchObservedRunningTime="2025-10-09 10:32:16.87890688 +0000 UTC m=+275.841107261" Oct 09 10:32:18 crc kubenswrapper[4740]: I1009 10:32:18.887769 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4t5tf" Oct 09 10:32:18 crc kubenswrapper[4740]: I1009 10:32:18.888151 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4t5tf" Oct 09 10:32:18 crc kubenswrapper[4740]: I1009 10:32:18.939588 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4t5tf" Oct 09 10:32:19 crc kubenswrapper[4740]: I1009 10:32:19.094287 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cnjss" Oct 09 10:32:19 crc kubenswrapper[4740]: I1009 10:32:19.094333 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cnjss" Oct 09 10:32:19 crc kubenswrapper[4740]: I1009 10:32:19.138433 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cnjss" Oct 09 10:32:19 crc kubenswrapper[4740]: I1009 10:32:19.898443 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cnjss" Oct 09 10:32:19 crc kubenswrapper[4740]: I1009 
10:32:19.906726 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4t5tf" Oct 09 10:32:21 crc kubenswrapper[4740]: I1009 10:32:21.319141 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qktxn" Oct 09 10:32:21 crc kubenswrapper[4740]: I1009 10:32:21.319488 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qktxn" Oct 09 10:32:21 crc kubenswrapper[4740]: I1009 10:32:21.361027 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qktxn" Oct 09 10:32:21 crc kubenswrapper[4740]: I1009 10:32:21.516465 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p8hx5" Oct 09 10:32:21 crc kubenswrapper[4740]: I1009 10:32:21.516507 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p8hx5" Oct 09 10:32:21 crc kubenswrapper[4740]: I1009 10:32:21.555534 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p8hx5" Oct 09 10:32:21 crc kubenswrapper[4740]: I1009 10:32:21.907348 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qktxn" Oct 09 10:32:21 crc kubenswrapper[4740]: I1009 10:32:21.912085 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p8hx5" Oct 09 10:33:35 crc kubenswrapper[4740]: I1009 10:33:35.407955 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 09 10:33:35 crc kubenswrapper[4740]: I1009 10:33:35.408542 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 10:34:05 crc kubenswrapper[4740]: I1009 10:34:05.408099 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 10:34:05 crc kubenswrapper[4740]: I1009 10:34:05.408664 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 10:34:35 crc kubenswrapper[4740]: I1009 10:34:35.407804 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 10:34:35 crc kubenswrapper[4740]: I1009 10:34:35.408619 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 10:34:35 crc kubenswrapper[4740]: I1009 10:34:35.408713 4740 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 10:34:35 crc kubenswrapper[4740]: I1009 10:34:35.409630 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"63604de549d11fb7c2176acf5b52492977a036f22bcf527081116471ae69cb41"} pod="openshift-machine-config-operator/machine-config-daemon-kdjch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 10:34:35 crc kubenswrapper[4740]: I1009 10:34:35.409781 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" containerID="cri-o://63604de549d11fb7c2176acf5b52492977a036f22bcf527081116471ae69cb41" gracePeriod=600 Oct 09 10:34:35 crc kubenswrapper[4740]: I1009 10:34:35.672609 4740 generic.go:334] "Generic (PLEG): container finished" podID="223b849a-db98-4f56-a649-9e144189950a" containerID="63604de549d11fb7c2176acf5b52492977a036f22bcf527081116471ae69cb41" exitCode=0 Oct 09 10:34:35 crc kubenswrapper[4740]: I1009 10:34:35.672668 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerDied","Data":"63604de549d11fb7c2176acf5b52492977a036f22bcf527081116471ae69cb41"} Oct 09 10:34:35 crc kubenswrapper[4740]: I1009 10:34:35.672722 4740 scope.go:117] "RemoveContainer" containerID="d61650cea1d7f238b29005a4b5b594045ba02d901bb86067f5e468430c1f9f6f" Oct 09 10:34:36 crc kubenswrapper[4740]: I1009 10:34:36.682669 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" 
event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerStarted","Data":"db6d8672ec18a4cbb225b6946060995fa4e5d7aa11d060bcb6cd3c36cef0c580"} Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.237669 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vsvzd"] Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.239089 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.249341 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vsvzd"] Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.405589 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d48f5d7-78ed-435f-af92-6776b09991a1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.406177 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.406302 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9zmt\" (UniqueName: \"kubernetes.io/projected/7d48f5d7-78ed-435f-af92-6776b09991a1-kube-api-access-s9zmt\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.406406 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d48f5d7-78ed-435f-af92-6776b09991a1-registry-certificates\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.406551 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d48f5d7-78ed-435f-af92-6776b09991a1-registry-tls\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.406611 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d48f5d7-78ed-435f-af92-6776b09991a1-trusted-ca\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.406654 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d48f5d7-78ed-435f-af92-6776b09991a1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.406715 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/7d48f5d7-78ed-435f-af92-6776b09991a1-bound-sa-token\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.431955 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.508675 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d48f5d7-78ed-435f-af92-6776b09991a1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.509201 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9zmt\" (UniqueName: \"kubernetes.io/projected/7d48f5d7-78ed-435f-af92-6776b09991a1-kube-api-access-s9zmt\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.509395 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d48f5d7-78ed-435f-af92-6776b09991a1-registry-certificates\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: 
I1009 10:35:53.509572 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d48f5d7-78ed-435f-af92-6776b09991a1-registry-tls\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.509789 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d48f5d7-78ed-435f-af92-6776b09991a1-trusted-ca\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.510034 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d48f5d7-78ed-435f-af92-6776b09991a1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.510300 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d48f5d7-78ed-435f-af92-6776b09991a1-bound-sa-token\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.510853 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d48f5d7-78ed-435f-af92-6776b09991a1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.511973 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d48f5d7-78ed-435f-af92-6776b09991a1-registry-certificates\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.512264 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d48f5d7-78ed-435f-af92-6776b09991a1-trusted-ca\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.519248 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d48f5d7-78ed-435f-af92-6776b09991a1-registry-tls\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.519291 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d48f5d7-78ed-435f-af92-6776b09991a1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.540277 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9zmt\" (UniqueName: \"kubernetes.io/projected/7d48f5d7-78ed-435f-af92-6776b09991a1-kube-api-access-s9zmt\") pod \"image-registry-66df7c8f76-vsvzd\" 
(UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.541686 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d48f5d7-78ed-435f-af92-6776b09991a1-bound-sa-token\") pod \"image-registry-66df7c8f76-vsvzd\" (UID: \"7d48f5d7-78ed-435f-af92-6776b09991a1\") " pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.556435 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:53 crc kubenswrapper[4740]: I1009 10:35:53.726633 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vsvzd"] Oct 09 10:35:54 crc kubenswrapper[4740]: I1009 10:35:54.171976 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" event={"ID":"7d48f5d7-78ed-435f-af92-6776b09991a1","Type":"ContainerStarted","Data":"2fe26c83c1b613f748fe6f8660be6ba67c73a6ac2831b4d343764bbda766e690"} Oct 09 10:35:54 crc kubenswrapper[4740]: I1009 10:35:54.172031 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" event={"ID":"7d48f5d7-78ed-435f-af92-6776b09991a1","Type":"ContainerStarted","Data":"90fb5819157ba68c84a275e1110a724903d7fb1e170f9a68821f818b1895c923"} Oct 09 10:35:54 crc kubenswrapper[4740]: I1009 10:35:54.172145 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:35:54 crc kubenswrapper[4740]: I1009 10:35:54.196634 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" podStartSLOduration=1.196616051 
podStartE2EDuration="1.196616051s" podCreationTimestamp="2025-10-09 10:35:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:35:54.192613139 +0000 UTC m=+493.154813540" watchObservedRunningTime="2025-10-09 10:35:54.196616051 +0000 UTC m=+493.158816432" Oct 09 10:36:13 crc kubenswrapper[4740]: I1009 10:36:13.564998 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-vsvzd" Oct 09 10:36:13 crc kubenswrapper[4740]: I1009 10:36:13.664677 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5pc6m"] Oct 09 10:36:35 crc kubenswrapper[4740]: I1009 10:36:35.407483 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 10:36:35 crc kubenswrapper[4740]: I1009 10:36:35.408163 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 10:36:38 crc kubenswrapper[4740]: I1009 10:36:38.726130 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" podUID="513aa088-5f0d-479a-9668-e8ae80738297" containerName="registry" containerID="cri-o://c60f1e9c5763740979f7792e714c57889d82164f1b46ebcab0acddbe7077b964" gracePeriod=30 Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.194194 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.287518 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4xkk\" (UniqueName: \"kubernetes.io/projected/513aa088-5f0d-479a-9668-e8ae80738297-kube-api-access-t4xkk\") pod \"513aa088-5f0d-479a-9668-e8ae80738297\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.287565 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/513aa088-5f0d-479a-9668-e8ae80738297-ca-trust-extracted\") pod \"513aa088-5f0d-479a-9668-e8ae80738297\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.287590 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/513aa088-5f0d-479a-9668-e8ae80738297-registry-certificates\") pod \"513aa088-5f0d-479a-9668-e8ae80738297\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.287617 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/513aa088-5f0d-479a-9668-e8ae80738297-installation-pull-secrets\") pod \"513aa088-5f0d-479a-9668-e8ae80738297\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.287639 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/513aa088-5f0d-479a-9668-e8ae80738297-bound-sa-token\") pod \"513aa088-5f0d-479a-9668-e8ae80738297\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.287670 4740 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/513aa088-5f0d-479a-9668-e8ae80738297-registry-tls\") pod \"513aa088-5f0d-479a-9668-e8ae80738297\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.287914 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"513aa088-5f0d-479a-9668-e8ae80738297\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.287993 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/513aa088-5f0d-479a-9668-e8ae80738297-trusted-ca\") pod \"513aa088-5f0d-479a-9668-e8ae80738297\" (UID: \"513aa088-5f0d-479a-9668-e8ae80738297\") " Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.288949 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/513aa088-5f0d-479a-9668-e8ae80738297-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "513aa088-5f0d-479a-9668-e8ae80738297" (UID: "513aa088-5f0d-479a-9668-e8ae80738297"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.289197 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/513aa088-5f0d-479a-9668-e8ae80738297-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "513aa088-5f0d-479a-9668-e8ae80738297" (UID: "513aa088-5f0d-479a-9668-e8ae80738297"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.294298 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/513aa088-5f0d-479a-9668-e8ae80738297-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "513aa088-5f0d-479a-9668-e8ae80738297" (UID: "513aa088-5f0d-479a-9668-e8ae80738297"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.294670 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/513aa088-5f0d-479a-9668-e8ae80738297-kube-api-access-t4xkk" (OuterVolumeSpecName: "kube-api-access-t4xkk") pod "513aa088-5f0d-479a-9668-e8ae80738297" (UID: "513aa088-5f0d-479a-9668-e8ae80738297"). InnerVolumeSpecName "kube-api-access-t4xkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.295138 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/513aa088-5f0d-479a-9668-e8ae80738297-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "513aa088-5f0d-479a-9668-e8ae80738297" (UID: "513aa088-5f0d-479a-9668-e8ae80738297"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.295239 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/513aa088-5f0d-479a-9668-e8ae80738297-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "513aa088-5f0d-479a-9668-e8ae80738297" (UID: "513aa088-5f0d-479a-9668-e8ae80738297"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.298544 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "513aa088-5f0d-479a-9668-e8ae80738297" (UID: "513aa088-5f0d-479a-9668-e8ae80738297"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.307339 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/513aa088-5f0d-479a-9668-e8ae80738297-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "513aa088-5f0d-479a-9668-e8ae80738297" (UID: "513aa088-5f0d-479a-9668-e8ae80738297"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.389433 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/513aa088-5f0d-479a-9668-e8ae80738297-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.389487 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4xkk\" (UniqueName: \"kubernetes.io/projected/513aa088-5f0d-479a-9668-e8ae80738297-kube-api-access-t4xkk\") on node \"crc\" DevicePath \"\"" Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.389508 4740 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/513aa088-5f0d-479a-9668-e8ae80738297-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.389527 4740 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/513aa088-5f0d-479a-9668-e8ae80738297-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.389546 4740 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/513aa088-5f0d-479a-9668-e8ae80738297-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.389564 4740 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/513aa088-5f0d-479a-9668-e8ae80738297-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.389581 4740 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/513aa088-5f0d-479a-9668-e8ae80738297-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.444594 4740 generic.go:334] "Generic (PLEG): container finished" podID="513aa088-5f0d-479a-9668-e8ae80738297" containerID="c60f1e9c5763740979f7792e714c57889d82164f1b46ebcab0acddbe7077b964" exitCode=0 Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.444706 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.444706 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" event={"ID":"513aa088-5f0d-479a-9668-e8ae80738297","Type":"ContainerDied","Data":"c60f1e9c5763740979f7792e714c57889d82164f1b46ebcab0acddbe7077b964"} Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.445250 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5pc6m" event={"ID":"513aa088-5f0d-479a-9668-e8ae80738297","Type":"ContainerDied","Data":"7e1e75bc2b3af83f830dff506cb187a6d2e3296d5fd08e32d3e882ae2179a978"} Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.445305 4740 scope.go:117] "RemoveContainer" containerID="c60f1e9c5763740979f7792e714c57889d82164f1b46ebcab0acddbe7077b964" Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.464979 4740 scope.go:117] "RemoveContainer" containerID="c60f1e9c5763740979f7792e714c57889d82164f1b46ebcab0acddbe7077b964" Oct 09 10:36:39 crc kubenswrapper[4740]: E1009 10:36:39.465419 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c60f1e9c5763740979f7792e714c57889d82164f1b46ebcab0acddbe7077b964\": container with ID starting with c60f1e9c5763740979f7792e714c57889d82164f1b46ebcab0acddbe7077b964 not found: ID does not exist" containerID="c60f1e9c5763740979f7792e714c57889d82164f1b46ebcab0acddbe7077b964" Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.465487 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c60f1e9c5763740979f7792e714c57889d82164f1b46ebcab0acddbe7077b964"} err="failed to get container status \"c60f1e9c5763740979f7792e714c57889d82164f1b46ebcab0acddbe7077b964\": rpc error: code = NotFound desc = could not find container 
\"c60f1e9c5763740979f7792e714c57889d82164f1b46ebcab0acddbe7077b964\": container with ID starting with c60f1e9c5763740979f7792e714c57889d82164f1b46ebcab0acddbe7077b964 not found: ID does not exist" Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.493157 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5pc6m"] Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.497883 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5pc6m"] Oct 09 10:36:39 crc kubenswrapper[4740]: I1009 10:36:39.767002 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="513aa088-5f0d-479a-9668-e8ae80738297" path="/var/lib/kubelet/pods/513aa088-5f0d-479a-9668-e8ae80738297/volumes" Oct 09 10:36:41 crc kubenswrapper[4740]: I1009 10:36:41.903638 4740 scope.go:117] "RemoveContainer" containerID="08b90332ebc889f0f116ac85a82504bae3ff37eecf9ec9354cccc18212528473" Oct 09 10:37:05 crc kubenswrapper[4740]: I1009 10:37:05.407807 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 10:37:05 crc kubenswrapper[4740]: I1009 10:37:05.408566 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 10:37:35 crc kubenswrapper[4740]: I1009 10:37:35.408135 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 10:37:35 crc kubenswrapper[4740]: I1009 10:37:35.408845 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 10:37:35 crc kubenswrapper[4740]: I1009 10:37:35.408908 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 10:37:35 crc kubenswrapper[4740]: I1009 10:37:35.409801 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"db6d8672ec18a4cbb225b6946060995fa4e5d7aa11d060bcb6cd3c36cef0c580"} pod="openshift-machine-config-operator/machine-config-daemon-kdjch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 10:37:35 crc kubenswrapper[4740]: I1009 10:37:35.409909 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" containerID="cri-o://db6d8672ec18a4cbb225b6946060995fa4e5d7aa11d060bcb6cd3c36cef0c580" gracePeriod=600 Oct 09 10:37:35 crc kubenswrapper[4740]: I1009 10:37:35.788380 4740 generic.go:334] "Generic (PLEG): container finished" podID="223b849a-db98-4f56-a649-9e144189950a" containerID="db6d8672ec18a4cbb225b6946060995fa4e5d7aa11d060bcb6cd3c36cef0c580" exitCode=0 Oct 09 10:37:35 crc kubenswrapper[4740]: I1009 10:37:35.788639 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" 
event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerDied","Data":"db6d8672ec18a4cbb225b6946060995fa4e5d7aa11d060bcb6cd3c36cef0c580"} Oct 09 10:37:35 crc kubenswrapper[4740]: I1009 10:37:35.788730 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerStarted","Data":"afadc9267ef0dcffe417993e78f8ce5f9baf0ee72c33f5f9de1c87bbb7818e64"} Oct 09 10:37:35 crc kubenswrapper[4740]: I1009 10:37:35.788777 4740 scope.go:117] "RemoveContainer" containerID="63604de549d11fb7c2176acf5b52492977a036f22bcf527081116471ae69cb41" Oct 09 10:37:41 crc kubenswrapper[4740]: I1009 10:37:41.952032 4740 scope.go:117] "RemoveContainer" containerID="c85899e4d065dc4235a56bc2bc70b1ee8953fb242b5f9f4a8629d6f1f89e107f" Oct 09 10:37:41 crc kubenswrapper[4740]: I1009 10:37:41.985244 4740 scope.go:117] "RemoveContainer" containerID="be45ce4ea72aff19cca6f8a8001dec6f8c5a001834a2e413b65c9ca29d6c784a" Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.655337 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-hlrtz"] Oct 09 10:38:23 crc kubenswrapper[4740]: E1009 10:38:23.656203 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="513aa088-5f0d-479a-9668-e8ae80738297" containerName="registry" Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.656218 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="513aa088-5f0d-479a-9668-e8ae80738297" containerName="registry" Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.656327 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="513aa088-5f0d-479a-9668-e8ae80738297" containerName="registry" Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.656843 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-hlrtz" Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.658512 4740 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-6mngk" Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.658970 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.659003 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.663037 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-vsq7z"] Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.663911 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-vsq7z" Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.666265 4740 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-bcsng" Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.678868 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-vsq7z"] Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.682038 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-rdl5l"] Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.682837 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-rdl5l" Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.684767 4740 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-stn5k" Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.695038 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-rdl5l"] Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.703571 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-hlrtz"] Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.789688 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7xts\" (UniqueName: \"kubernetes.io/projected/a9ffc41f-4710-469a-bae3-ae15d4eafd9b-kube-api-access-n7xts\") pod \"cert-manager-webhook-5655c58dd6-rdl5l\" (UID: \"a9ffc41f-4710-469a-bae3-ae15d4eafd9b\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-rdl5l" Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.789973 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sj5s\" (UniqueName: \"kubernetes.io/projected/9b5b17c6-4d72-4295-bb2b-436b65625a66-kube-api-access-9sj5s\") pod \"cert-manager-5b446d88c5-vsq7z\" (UID: \"9b5b17c6-4d72-4295-bb2b-436b65625a66\") " pod="cert-manager/cert-manager-5b446d88c5-vsq7z" Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.790105 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cqzd\" (UniqueName: \"kubernetes.io/projected/7d2a1d30-c83b-41ce-839e-3eb1f655a1c3-kube-api-access-6cqzd\") pod \"cert-manager-cainjector-7f985d654d-hlrtz\" (UID: \"7d2a1d30-c83b-41ce-839e-3eb1f655a1c3\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-hlrtz" Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 
10:38:23.891428 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7xts\" (UniqueName: \"kubernetes.io/projected/a9ffc41f-4710-469a-bae3-ae15d4eafd9b-kube-api-access-n7xts\") pod \"cert-manager-webhook-5655c58dd6-rdl5l\" (UID: \"a9ffc41f-4710-469a-bae3-ae15d4eafd9b\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-rdl5l" Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.891501 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sj5s\" (UniqueName: \"kubernetes.io/projected/9b5b17c6-4d72-4295-bb2b-436b65625a66-kube-api-access-9sj5s\") pod \"cert-manager-5b446d88c5-vsq7z\" (UID: \"9b5b17c6-4d72-4295-bb2b-436b65625a66\") " pod="cert-manager/cert-manager-5b446d88c5-vsq7z" Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.891551 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cqzd\" (UniqueName: \"kubernetes.io/projected/7d2a1d30-c83b-41ce-839e-3eb1f655a1c3-kube-api-access-6cqzd\") pod \"cert-manager-cainjector-7f985d654d-hlrtz\" (UID: \"7d2a1d30-c83b-41ce-839e-3eb1f655a1c3\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-hlrtz" Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.919625 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7xts\" (UniqueName: \"kubernetes.io/projected/a9ffc41f-4710-469a-bae3-ae15d4eafd9b-kube-api-access-n7xts\") pod \"cert-manager-webhook-5655c58dd6-rdl5l\" (UID: \"a9ffc41f-4710-469a-bae3-ae15d4eafd9b\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-rdl5l" Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.919625 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sj5s\" (UniqueName: \"kubernetes.io/projected/9b5b17c6-4d72-4295-bb2b-436b65625a66-kube-api-access-9sj5s\") pod \"cert-manager-5b446d88c5-vsq7z\" (UID: \"9b5b17c6-4d72-4295-bb2b-436b65625a66\") " 
pod="cert-manager/cert-manager-5b446d88c5-vsq7z" Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.920455 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cqzd\" (UniqueName: \"kubernetes.io/projected/7d2a1d30-c83b-41ce-839e-3eb1f655a1c3-kube-api-access-6cqzd\") pod \"cert-manager-cainjector-7f985d654d-hlrtz\" (UID: \"7d2a1d30-c83b-41ce-839e-3eb1f655a1c3\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-hlrtz" Oct 09 10:38:23 crc kubenswrapper[4740]: I1009 10:38:23.982462 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-hlrtz" Oct 09 10:38:24 crc kubenswrapper[4740]: I1009 10:38:24.002335 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-vsq7z" Oct 09 10:38:24 crc kubenswrapper[4740]: I1009 10:38:24.002375 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-rdl5l" Oct 09 10:38:24 crc kubenswrapper[4740]: I1009 10:38:24.271096 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-vsq7z"] Oct 09 10:38:24 crc kubenswrapper[4740]: W1009 10:38:24.279909 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b5b17c6_4d72_4295_bb2b_436b65625a66.slice/crio-e0c6af2684fa13a820a2810b61f44d239c38e37db565df27e47bfb92480b0c20 WatchSource:0}: Error finding container e0c6af2684fa13a820a2810b61f44d239c38e37db565df27e47bfb92480b0c20: Status 404 returned error can't find the container with id e0c6af2684fa13a820a2810b61f44d239c38e37db565df27e47bfb92480b0c20 Oct 09 10:38:24 crc kubenswrapper[4740]: I1009 10:38:24.282617 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 10:38:24 crc kubenswrapper[4740]: I1009 10:38:24.479057 4740 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-hlrtz"] Oct 09 10:38:24 crc kubenswrapper[4740]: I1009 10:38:24.487003 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-rdl5l"] Oct 09 10:38:24 crc kubenswrapper[4740]: W1009 10:38:24.495117 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9ffc41f_4710_469a_bae3_ae15d4eafd9b.slice/crio-2ce31d50d005547e400bfd255eebfc662bde7e4c6446b848d80f7ccda662b8bd WatchSource:0}: Error finding container 2ce31d50d005547e400bfd255eebfc662bde7e4c6446b848d80f7ccda662b8bd: Status 404 returned error can't find the container with id 2ce31d50d005547e400bfd255eebfc662bde7e4c6446b848d80f7ccda662b8bd Oct 09 10:38:24 crc kubenswrapper[4740]: W1009 10:38:24.496112 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d2a1d30_c83b_41ce_839e_3eb1f655a1c3.slice/crio-1c3cf53a7e1b5e4d439cb50c05b6b5257ed95048d28b5bd4de18508eaf78b4fe WatchSource:0}: Error finding container 1c3cf53a7e1b5e4d439cb50c05b6b5257ed95048d28b5bd4de18508eaf78b4fe: Status 404 returned error can't find the container with id 1c3cf53a7e1b5e4d439cb50c05b6b5257ed95048d28b5bd4de18508eaf78b4fe Oct 09 10:38:25 crc kubenswrapper[4740]: I1009 10:38:25.086812 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-hlrtz" event={"ID":"7d2a1d30-c83b-41ce-839e-3eb1f655a1c3","Type":"ContainerStarted","Data":"1c3cf53a7e1b5e4d439cb50c05b6b5257ed95048d28b5bd4de18508eaf78b4fe"} Oct 09 10:38:25 crc kubenswrapper[4740]: I1009 10:38:25.087880 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-vsq7z" 
event={"ID":"9b5b17c6-4d72-4295-bb2b-436b65625a66","Type":"ContainerStarted","Data":"e0c6af2684fa13a820a2810b61f44d239c38e37db565df27e47bfb92480b0c20"} Oct 09 10:38:25 crc kubenswrapper[4740]: I1009 10:38:25.088948 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-rdl5l" event={"ID":"a9ffc41f-4710-469a-bae3-ae15d4eafd9b","Type":"ContainerStarted","Data":"2ce31d50d005547e400bfd255eebfc662bde7e4c6446b848d80f7ccda662b8bd"} Oct 09 10:38:28 crc kubenswrapper[4740]: I1009 10:38:28.107917 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-rdl5l" event={"ID":"a9ffc41f-4710-469a-bae3-ae15d4eafd9b","Type":"ContainerStarted","Data":"b93a2388932efd0eb0b1a33f3e07348c89b09914d9c72f890b2e5133c4124188"} Oct 09 10:38:28 crc kubenswrapper[4740]: I1009 10:38:28.108284 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-rdl5l" Oct 09 10:38:28 crc kubenswrapper[4740]: I1009 10:38:28.111096 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-hlrtz" event={"ID":"7d2a1d30-c83b-41ce-839e-3eb1f655a1c3","Type":"ContainerStarted","Data":"fe1e07db9a7ee75b2d0de31bb0d57cc02bb0ef6312b02f9a6aca2fb3628f1327"} Oct 09 10:38:28 crc kubenswrapper[4740]: I1009 10:38:28.113126 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-vsq7z" event={"ID":"9b5b17c6-4d72-4295-bb2b-436b65625a66","Type":"ContainerStarted","Data":"83645a54b7c8176de499e646fd35199a342b7ee6f5af9ed5176123eb0a350f79"} Oct 09 10:38:28 crc kubenswrapper[4740]: I1009 10:38:28.125288 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-rdl5l" podStartSLOduration=2.20676829 podStartE2EDuration="5.125265234s" podCreationTimestamp="2025-10-09 10:38:23 +0000 UTC" firstStartedPulling="2025-10-09 
10:38:24.500500617 +0000 UTC m=+643.462700998" lastFinishedPulling="2025-10-09 10:38:27.418997551 +0000 UTC m=+646.381197942" observedRunningTime="2025-10-09 10:38:28.123932487 +0000 UTC m=+647.086132878" watchObservedRunningTime="2025-10-09 10:38:28.125265234 +0000 UTC m=+647.087465635" Oct 09 10:38:28 crc kubenswrapper[4740]: I1009 10:38:28.140975 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-vsq7z" podStartSLOduration=2.005155517 podStartE2EDuration="5.140959131s" podCreationTimestamp="2025-10-09 10:38:23 +0000 UTC" firstStartedPulling="2025-10-09 10:38:24.282411245 +0000 UTC m=+643.244611626" lastFinishedPulling="2025-10-09 10:38:27.418214849 +0000 UTC m=+646.380415240" observedRunningTime="2025-10-09 10:38:28.139219863 +0000 UTC m=+647.101420254" watchObservedRunningTime="2025-10-09 10:38:28.140959131 +0000 UTC m=+647.103159532" Oct 09 10:38:28 crc kubenswrapper[4740]: I1009 10:38:28.156487 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-hlrtz" podStartSLOduration=2.1662210809999998 podStartE2EDuration="5.156471413s" podCreationTimestamp="2025-10-09 10:38:23 +0000 UTC" firstStartedPulling="2025-10-09 10:38:24.498714177 +0000 UTC m=+643.460914548" lastFinishedPulling="2025-10-09 10:38:27.488964499 +0000 UTC m=+646.451164880" observedRunningTime="2025-10-09 10:38:28.152943415 +0000 UTC m=+647.115143806" watchObservedRunningTime="2025-10-09 10:38:28.156471413 +0000 UTC m=+647.118671804" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.007468 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-rdl5l" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.170035 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-klnl8"] Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.171091 4740 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="northd" containerID="cri-o://19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9" gracePeriod=30 Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.171193 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="sbdb" containerID="cri-o://5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a" gracePeriod=30 Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.171246 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovn-acl-logging" containerID="cri-o://d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25" gracePeriod=30 Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.171223 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="kube-rbac-proxy-node" containerID="cri-o://dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb" gracePeriod=30 Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.171232 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="nbdb" containerID="cri-o://9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24" gracePeriod=30 Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.171045 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" 
containerName="ovn-controller" containerID="cri-o://3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa" gracePeriod=30 Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.171229 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555" gracePeriod=30 Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.214596 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovnkube-controller" containerID="cri-o://e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32" gracePeriod=30 Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.509663 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-klnl8_192f5d73-ad53-4674-8c35-c72343c6022e/ovnkube-controller/3.log" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.512564 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-klnl8_192f5d73-ad53-4674-8c35-c72343c6022e/ovn-acl-logging/0.log" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.513228 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-klnl8_192f5d73-ad53-4674-8c35-c72343c6022e/ovn-controller/0.log" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.513746 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581078 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jkjkz"] Oct 09 10:38:34 crc kubenswrapper[4740]: E1009 10:38:34.581312 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="kube-rbac-proxy-node" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581333 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="kube-rbac-proxy-node" Oct 09 10:38:34 crc kubenswrapper[4740]: E1009 10:38:34.581352 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="kube-rbac-proxy-ovn-metrics" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581361 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="kube-rbac-proxy-ovn-metrics" Oct 09 10:38:34 crc kubenswrapper[4740]: E1009 10:38:34.581371 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovnkube-controller" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581380 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovnkube-controller" Oct 09 10:38:34 crc kubenswrapper[4740]: E1009 10:38:34.581392 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovnkube-controller" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581399 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovnkube-controller" Oct 09 10:38:34 crc kubenswrapper[4740]: E1009 10:38:34.581407 4740 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovn-controller" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581415 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovn-controller" Oct 09 10:38:34 crc kubenswrapper[4740]: E1009 10:38:34.581423 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="northd" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581432 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="northd" Oct 09 10:38:34 crc kubenswrapper[4740]: E1009 10:38:34.581443 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovnkube-controller" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581450 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovnkube-controller" Oct 09 10:38:34 crc kubenswrapper[4740]: E1009 10:38:34.581458 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovnkube-controller" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581465 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovnkube-controller" Oct 09 10:38:34 crc kubenswrapper[4740]: E1009 10:38:34.581475 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="nbdb" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581482 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="nbdb" Oct 09 10:38:34 crc kubenswrapper[4740]: E1009 10:38:34.581493 4740 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="kubecfg-setup" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581500 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="kubecfg-setup" Oct 09 10:38:34 crc kubenswrapper[4740]: E1009 10:38:34.581510 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="sbdb" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581518 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="sbdb" Oct 09 10:38:34 crc kubenswrapper[4740]: E1009 10:38:34.581529 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovn-acl-logging" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581536 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovn-acl-logging" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581683 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovnkube-controller" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581700 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovnkube-controller" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581710 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="nbdb" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581718 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovnkube-controller" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581727 4740 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovn-acl-logging" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581738 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovnkube-controller" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581752 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="northd" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581761 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovn-controller" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581770 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="sbdb" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581804 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="kube-rbac-proxy-ovn-metrics" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581813 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="kube-rbac-proxy-node" Oct 09 10:38:34 crc kubenswrapper[4740]: E1009 10:38:34.581927 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovnkube-controller" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.581937 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovnkube-controller" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.582036 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" containerName="ovnkube-controller" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.583917 4740 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.653954 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-run-netns\") pod \"192f5d73-ad53-4674-8c35-c72343c6022e\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.654019 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-run-systemd\") pod \"192f5d73-ad53-4674-8c35-c72343c6022e\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.654091 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-log-socket\") pod \"192f5d73-ad53-4674-8c35-c72343c6022e\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.654119 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "192f5d73-ad53-4674-8c35-c72343c6022e" (UID: "192f5d73-ad53-4674-8c35-c72343c6022e"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.654146 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/192f5d73-ad53-4674-8c35-c72343c6022e-ovn-node-metrics-cert\") pod \"192f5d73-ad53-4674-8c35-c72343c6022e\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.654191 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-log-socket" (OuterVolumeSpecName: "log-socket") pod "192f5d73-ad53-4674-8c35-c72343c6022e" (UID: "192f5d73-ad53-4674-8c35-c72343c6022e"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.654204 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-slash\") pod \"192f5d73-ad53-4674-8c35-c72343c6022e\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.654282 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gsjm\" (UniqueName: \"kubernetes.io/projected/192f5d73-ad53-4674-8c35-c72343c6022e-kube-api-access-6gsjm\") pod \"192f5d73-ad53-4674-8c35-c72343c6022e\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.654327 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-systemd-units\") pod \"192f5d73-ad53-4674-8c35-c72343c6022e\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.654368 4740 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-node-log\") pod \"192f5d73-ad53-4674-8c35-c72343c6022e\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.654395 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-slash" (OuterVolumeSpecName: "host-slash") pod "192f5d73-ad53-4674-8c35-c72343c6022e" (UID: "192f5d73-ad53-4674-8c35-c72343c6022e"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.654428 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/192f5d73-ad53-4674-8c35-c72343c6022e-ovnkube-config\") pod \"192f5d73-ad53-4674-8c35-c72343c6022e\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.654488 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "192f5d73-ad53-4674-8c35-c72343c6022e" (UID: "192f5d73-ad53-4674-8c35-c72343c6022e"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.654570 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "192f5d73-ad53-4674-8c35-c72343c6022e" (UID: "192f5d73-ad53-4674-8c35-c72343c6022e"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.654613 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-node-log" (OuterVolumeSpecName: "node-log") pod "192f5d73-ad53-4674-8c35-c72343c6022e" (UID: "192f5d73-ad53-4674-8c35-c72343c6022e"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.654512 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-var-lib-openvswitch\") pod \"192f5d73-ad53-4674-8c35-c72343c6022e\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.654967 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-kubelet\") pod \"192f5d73-ad53-4674-8c35-c72343c6022e\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655006 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-run-openvswitch\") pod \"192f5d73-ad53-4674-8c35-c72343c6022e\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655045 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-etc-openvswitch\") pod \"192f5d73-ad53-4674-8c35-c72343c6022e\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655055 4740 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "192f5d73-ad53-4674-8c35-c72343c6022e" (UID: "192f5d73-ad53-4674-8c35-c72343c6022e"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655103 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-run-ovn\") pod \"192f5d73-ad53-4674-8c35-c72343c6022e\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655112 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "192f5d73-ad53-4674-8c35-c72343c6022e" (UID: "192f5d73-ad53-4674-8c35-c72343c6022e"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655130 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"192f5d73-ad53-4674-8c35-c72343c6022e\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655163 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "192f5d73-ad53-4674-8c35-c72343c6022e" (UID: "192f5d73-ad53-4674-8c35-c72343c6022e"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655180 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "192f5d73-ad53-4674-8c35-c72343c6022e" (UID: "192f5d73-ad53-4674-8c35-c72343c6022e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655228 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-cni-netd\") pod \"192f5d73-ad53-4674-8c35-c72343c6022e\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655270 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/192f5d73-ad53-4674-8c35-c72343c6022e-env-overrides\") pod \"192f5d73-ad53-4674-8c35-c72343c6022e\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655198 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "192f5d73-ad53-4674-8c35-c72343c6022e" (UID: "192f5d73-ad53-4674-8c35-c72343c6022e"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655260 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "192f5d73-ad53-4674-8c35-c72343c6022e" (UID: "192f5d73-ad53-4674-8c35-c72343c6022e"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655294 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-run-ovn-kubernetes\") pod \"192f5d73-ad53-4674-8c35-c72343c6022e\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655317 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "192f5d73-ad53-4674-8c35-c72343c6022e" (UID: "192f5d73-ad53-4674-8c35-c72343c6022e"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655339 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-cni-bin\") pod \"192f5d73-ad53-4674-8c35-c72343c6022e\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655396 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/192f5d73-ad53-4674-8c35-c72343c6022e-ovnkube-script-lib\") pod \"192f5d73-ad53-4674-8c35-c72343c6022e\" (UID: \"192f5d73-ad53-4674-8c35-c72343c6022e\") " Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655464 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "192f5d73-ad53-4674-8c35-c72343c6022e" (UID: "192f5d73-ad53-4674-8c35-c72343c6022e"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655622 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-host-cni-netd\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655721 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-host-cni-bin\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655828 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-host-kubelet\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655938 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-systemd-units\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655923 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/192f5d73-ad53-4674-8c35-c72343c6022e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "192f5d73-ad53-4674-8c35-c72343c6022e" (UID: 
"192f5d73-ad53-4674-8c35-c72343c6022e"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.655961 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/192f5d73-ad53-4674-8c35-c72343c6022e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "192f5d73-ad53-4674-8c35-c72343c6022e" (UID: "192f5d73-ad53-4674-8c35-c72343c6022e"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.656009 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/192f5d73-ad53-4674-8c35-c72343c6022e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "192f5d73-ad53-4674-8c35-c72343c6022e" (UID: "192f5d73-ad53-4674-8c35-c72343c6022e"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.656084 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/35d39da8-b79b-45f3-9df0-d3feb89bba5c-env-overrides\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.656152 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/35d39da8-b79b-45f3-9df0-d3feb89bba5c-ovn-node-metrics-cert\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.656181 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-etc-openvswitch\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.656207 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-run-systemd\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.656233 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-run-ovn\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.656254 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtn6q\" (UniqueName: \"kubernetes.io/projected/35d39da8-b79b-45f3-9df0-d3feb89bba5c-kube-api-access-gtn6q\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.656275 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-host-run-netns\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.656465 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-run-openvswitch\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.656522 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-log-socket\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.656550 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-host-run-ovn-kubernetes\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.656586 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-var-lib-openvswitch\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.656650 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-host-slash\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.656682 
4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/35d39da8-b79b-45f3-9df0-d3feb89bba5c-ovnkube-config\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.656847 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/35d39da8-b79b-45f3-9df0-d3feb89bba5c-ovnkube-script-lib\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.656949 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-node-log\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.656971 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.657092 4740 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.657140 4740 reconciler_common.go:293] "Volume detached for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-log-socket\") on node \"crc\" DevicePath \"\"" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.657163 4740 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-slash\") on node \"crc\" DevicePath \"\"" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.657183 4740 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.657203 4740 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-node-log\") on node \"crc\" DevicePath \"\"" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.657221 4740 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/192f5d73-ad53-4674-8c35-c72343c6022e-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.657240 4740 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.657258 4740 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.657277 4740 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-run-openvswitch\") on node \"crc\" DevicePath \"\"" 
Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.657294 4740 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.657313 4740 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.657334 4740 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.657355 4740 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.657375 4740 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/192f5d73-ad53-4674-8c35-c72343c6022e-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.657394 4740 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.657414 4740 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.657434 4740 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/192f5d73-ad53-4674-8c35-c72343c6022e-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.659904 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/192f5d73-ad53-4674-8c35-c72343c6022e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "192f5d73-ad53-4674-8c35-c72343c6022e" (UID: "192f5d73-ad53-4674-8c35-c72343c6022e"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.660162 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/192f5d73-ad53-4674-8c35-c72343c6022e-kube-api-access-6gsjm" (OuterVolumeSpecName: "kube-api-access-6gsjm") pod "192f5d73-ad53-4674-8c35-c72343c6022e" (UID: "192f5d73-ad53-4674-8c35-c72343c6022e"). InnerVolumeSpecName "kube-api-access-6gsjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.670453 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "192f5d73-ad53-4674-8c35-c72343c6022e" (UID: "192f5d73-ad53-4674-8c35-c72343c6022e"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758348 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/35d39da8-b79b-45f3-9df0-d3feb89bba5c-env-overrides\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758427 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/35d39da8-b79b-45f3-9df0-d3feb89bba5c-ovn-node-metrics-cert\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758443 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-etc-openvswitch\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758534 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-etc-openvswitch\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758469 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-run-systemd\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc 
kubenswrapper[4740]: I1009 10:38:34.758589 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-run-ovn\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758607 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtn6q\" (UniqueName: \"kubernetes.io/projected/35d39da8-b79b-45f3-9df0-d3feb89bba5c-kube-api-access-gtn6q\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758631 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-host-run-netns\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758677 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-run-openvswitch\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758695 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-log-socket\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758708 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-host-run-ovn-kubernetes\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758723 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-var-lib-openvswitch\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758744 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-host-slash\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758777 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/35d39da8-b79b-45f3-9df0-d3feb89bba5c-ovnkube-config\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758793 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/35d39da8-b79b-45f3-9df0-d3feb89bba5c-ovnkube-script-lib\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758815 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-node-log\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758831 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758855 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-host-cni-netd\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758880 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-host-cni-bin\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758899 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-host-kubelet\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758921 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-systemd-units\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758981 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gsjm\" (UniqueName: \"kubernetes.io/projected/192f5d73-ad53-4674-8c35-c72343c6022e-kube-api-access-6gsjm\") on node \"crc\" DevicePath \"\"" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.758997 4740 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/192f5d73-ad53-4674-8c35-c72343c6022e-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.759006 4740 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/192f5d73-ad53-4674-8c35-c72343c6022e-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.759003 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-var-lib-openvswitch\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.759036 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-systemd-units\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.759084 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-run-systemd\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.759090 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-host-slash\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.759124 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-run-openvswitch\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.759131 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-log-socket\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.759128 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-host-run-netns\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.759161 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-host-run-ovn-kubernetes\") pod \"ovnkube-node-jkjkz\" (UID: 
\"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.759188 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-run-ovn\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.759218 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.759246 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/35d39da8-b79b-45f3-9df0-d3feb89bba5c-env-overrides\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.759321 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-host-cni-bin\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.759357 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-host-cni-netd\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.759389 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-host-kubelet\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.759423 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/35d39da8-b79b-45f3-9df0-d3feb89bba5c-node-log\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.759836 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/35d39da8-b79b-45f3-9df0-d3feb89bba5c-ovnkube-script-lib\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.759873 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/35d39da8-b79b-45f3-9df0-d3feb89bba5c-ovnkube-config\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.762045 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/35d39da8-b79b-45f3-9df0-d3feb89bba5c-ovn-node-metrics-cert\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.777347 
4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtn6q\" (UniqueName: \"kubernetes.io/projected/35d39da8-b79b-45f3-9df0-d3feb89bba5c-kube-api-access-gtn6q\") pod \"ovnkube-node-jkjkz\" (UID: \"35d39da8-b79b-45f3-9df0-d3feb89bba5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:34 crc kubenswrapper[4740]: I1009 10:38:34.900157 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.168884 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-klnl8_192f5d73-ad53-4674-8c35-c72343c6022e/ovnkube-controller/3.log" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.174197 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-klnl8_192f5d73-ad53-4674-8c35-c72343c6022e/ovn-acl-logging/0.log" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.175034 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-klnl8_192f5d73-ad53-4674-8c35-c72343c6022e/ovn-controller/0.log" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.175551 4740 generic.go:334] "Generic (PLEG): container finished" podID="192f5d73-ad53-4674-8c35-c72343c6022e" containerID="e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32" exitCode=0 Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.175595 4740 generic.go:334] "Generic (PLEG): container finished" podID="192f5d73-ad53-4674-8c35-c72343c6022e" containerID="5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a" exitCode=0 Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.175611 4740 generic.go:334] "Generic (PLEG): container finished" podID="192f5d73-ad53-4674-8c35-c72343c6022e" containerID="9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24" exitCode=0 Oct 09 10:38:35 crc 
kubenswrapper[4740]: I1009 10:38:35.175625 4740 generic.go:334] "Generic (PLEG): container finished" podID="192f5d73-ad53-4674-8c35-c72343c6022e" containerID="19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9" exitCode=0 Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.175638 4740 generic.go:334] "Generic (PLEG): container finished" podID="192f5d73-ad53-4674-8c35-c72343c6022e" containerID="59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555" exitCode=0 Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.175651 4740 generic.go:334] "Generic (PLEG): container finished" podID="192f5d73-ad53-4674-8c35-c72343c6022e" containerID="dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb" exitCode=0 Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.175663 4740 generic.go:334] "Generic (PLEG): container finished" podID="192f5d73-ad53-4674-8c35-c72343c6022e" containerID="d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25" exitCode=143 Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.175678 4740 generic.go:334] "Generic (PLEG): container finished" podID="192f5d73-ad53-4674-8c35-c72343c6022e" containerID="3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa" exitCode=143 Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.175658 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerDied","Data":"e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.175746 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.175852 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerDied","Data":"5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.175914 4740 scope.go:117] "RemoveContainer" containerID="e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.175919 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerDied","Data":"9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.175958 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerDied","Data":"19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.175984 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerDied","Data":"59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176009 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerDied","Data":"dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176034 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176057 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176073 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176088 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176102 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176115 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176129 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176144 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176158 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176179 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerDied","Data":"d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176203 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176220 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176236 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176251 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176264 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176282 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176295 4740 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176309 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176322 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176336 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176356 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerDied","Data":"3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176377 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176394 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176410 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a"} Oct 09 
10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176424 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176438 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176454 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176472 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176486 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176500 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176517 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176539 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-klnl8" 
event={"ID":"192f5d73-ad53-4674-8c35-c72343c6022e","Type":"ContainerDied","Data":"1d5b6e39d55af80cad3fa67530110d5e10a599497266b473a9a39109ef93f006"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176561 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176578 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176594 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176608 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176623 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176637 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176653 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176666 4740 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176680 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.176694 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.184872 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qrhgt_73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c/kube-multus/2.log" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.185966 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qrhgt_73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c/kube-multus/1.log" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.186042 4740 generic.go:334] "Generic (PLEG): container finished" podID="73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c" containerID="291dfda6e2a2a98625a59d8fb1e8a1e9ca87c6d5b3650d8087ca2d28c0ae233c" exitCode=2 Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.186166 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qrhgt" event={"ID":"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c","Type":"ContainerDied","Data":"291dfda6e2a2a98625a59d8fb1e8a1e9ca87c6d5b3650d8087ca2d28c0ae233c"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.186362 4740 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ed60b7e9b987350e5bfa5f576c1b11d0e02fa7c1adba23203dbfb327ce4f518"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.187227 4740 scope.go:117] "RemoveContainer" 
containerID="291dfda6e2a2a98625a59d8fb1e8a1e9ca87c6d5b3650d8087ca2d28c0ae233c" Oct 09 10:38:35 crc kubenswrapper[4740]: E1009 10:38:35.187976 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-qrhgt_openshift-multus(73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c)\"" pod="openshift-multus/multus-qrhgt" podUID="73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.188316 4740 generic.go:334] "Generic (PLEG): container finished" podID="35d39da8-b79b-45f3-9df0-d3feb89bba5c" containerID="38701a27e3cd3264b268c6b1ce95190200cb5bae6e516cd34b2d412893626441" exitCode=0 Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.188357 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" event={"ID":"35d39da8-b79b-45f3-9df0-d3feb89bba5c","Type":"ContainerDied","Data":"38701a27e3cd3264b268c6b1ce95190200cb5bae6e516cd34b2d412893626441"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.188376 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" event={"ID":"35d39da8-b79b-45f3-9df0-d3feb89bba5c","Type":"ContainerStarted","Data":"12e3c6acd47d6dc67b064581257ae294e8612c5730689485bd59ab31fdf1be05"} Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.201132 4740 scope.go:117] "RemoveContainer" containerID="f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.227808 4740 scope.go:117] "RemoveContainer" containerID="5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.250922 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-klnl8"] Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.252318 4740 scope.go:117] 
"RemoveContainer" containerID="9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.254016 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-klnl8"] Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.262575 4740 scope.go:117] "RemoveContainer" containerID="19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.281014 4740 scope.go:117] "RemoveContainer" containerID="59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.325494 4740 scope.go:117] "RemoveContainer" containerID="dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.341571 4740 scope.go:117] "RemoveContainer" containerID="d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.355299 4740 scope.go:117] "RemoveContainer" containerID="3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.368931 4740 scope.go:117] "RemoveContainer" containerID="04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.380897 4740 scope.go:117] "RemoveContainer" containerID="e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32" Oct 09 10:38:35 crc kubenswrapper[4740]: E1009 10:38:35.381284 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32\": container with ID starting with e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32 not found: ID does not exist" containerID="e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32" Oct 09 10:38:35 
crc kubenswrapper[4740]: I1009 10:38:35.381314 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32"} err="failed to get container status \"e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32\": rpc error: code = NotFound desc = could not find container \"e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32\": container with ID starting with e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32 not found: ID does not exist" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.381342 4740 scope.go:117] "RemoveContainer" containerID="f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767" Oct 09 10:38:35 crc kubenswrapper[4740]: E1009 10:38:35.381772 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767\": container with ID starting with f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767 not found: ID does not exist" containerID="f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.381799 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767"} err="failed to get container status \"f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767\": rpc error: code = NotFound desc = could not find container \"f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767\": container with ID starting with f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767 not found: ID does not exist" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.381813 4740 scope.go:117] "RemoveContainer" containerID="5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a" Oct 09 
10:38:35 crc kubenswrapper[4740]: E1009 10:38:35.382171 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\": container with ID starting with 5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a not found: ID does not exist" containerID="5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.382197 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a"} err="failed to get container status \"5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\": rpc error: code = NotFound desc = could not find container \"5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\": container with ID starting with 5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a not found: ID does not exist" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.382215 4740 scope.go:117] "RemoveContainer" containerID="9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24" Oct 09 10:38:35 crc kubenswrapper[4740]: E1009 10:38:35.382565 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\": container with ID starting with 9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24 not found: ID does not exist" containerID="9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.382597 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24"} err="failed to get container status 
\"9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\": rpc error: code = NotFound desc = could not find container \"9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\": container with ID starting with 9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24 not found: ID does not exist" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.382616 4740 scope.go:117] "RemoveContainer" containerID="19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9" Oct 09 10:38:35 crc kubenswrapper[4740]: E1009 10:38:35.382926 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\": container with ID starting with 19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9 not found: ID does not exist" containerID="19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.382954 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9"} err="failed to get container status \"19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\": rpc error: code = NotFound desc = could not find container \"19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\": container with ID starting with 19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9 not found: ID does not exist" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.382972 4740 scope.go:117] "RemoveContainer" containerID="59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555" Oct 09 10:38:35 crc kubenswrapper[4740]: E1009 10:38:35.383252 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\": container with ID starting with 59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555 not found: ID does not exist" containerID="59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.383279 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555"} err="failed to get container status \"59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\": rpc error: code = NotFound desc = could not find container \"59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\": container with ID starting with 59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555 not found: ID does not exist" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.383299 4740 scope.go:117] "RemoveContainer" containerID="dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb" Oct 09 10:38:35 crc kubenswrapper[4740]: E1009 10:38:35.383582 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\": container with ID starting with dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb not found: ID does not exist" containerID="dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.383620 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb"} err="failed to get container status \"dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\": rpc error: code = NotFound desc = could not find container \"dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\": container with ID 
starting with dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb not found: ID does not exist" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.383638 4740 scope.go:117] "RemoveContainer" containerID="d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25" Oct 09 10:38:35 crc kubenswrapper[4740]: E1009 10:38:35.383927 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\": container with ID starting with d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25 not found: ID does not exist" containerID="d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.383957 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25"} err="failed to get container status \"d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\": rpc error: code = NotFound desc = could not find container \"d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\": container with ID starting with d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25 not found: ID does not exist" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.383974 4740 scope.go:117] "RemoveContainer" containerID="3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa" Oct 09 10:38:35 crc kubenswrapper[4740]: E1009 10:38:35.384195 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\": container with ID starting with 3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa not found: ID does not exist" containerID="3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa" Oct 09 
10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.384224 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa"} err="failed to get container status \"3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\": rpc error: code = NotFound desc = could not find container \"3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\": container with ID starting with 3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa not found: ID does not exist" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.384241 4740 scope.go:117] "RemoveContainer" containerID="04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065" Oct 09 10:38:35 crc kubenswrapper[4740]: E1009 10:38:35.384436 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\": container with ID starting with 04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065 not found: ID does not exist" containerID="04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.384464 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065"} err="failed to get container status \"04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\": rpc error: code = NotFound desc = could not find container \"04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\": container with ID starting with 04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065 not found: ID does not exist" Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.384519 4740 scope.go:117] "RemoveContainer" 
containerID="e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.386372 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32"} err="failed to get container status \"e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32\": rpc error: code = NotFound desc = could not find container \"e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32\": container with ID starting with e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.386449 4740 scope.go:117] "RemoveContainer" containerID="f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.387104 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767"} err="failed to get container status \"f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767\": rpc error: code = NotFound desc = could not find container \"f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767\": container with ID starting with f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.387154 4740 scope.go:117] "RemoveContainer" containerID="5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.387572 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a"} err="failed to get container status \"5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\": rpc error: code = NotFound desc = could not find container \"5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\": container with ID starting with 5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.387594 4740 scope.go:117] "RemoveContainer" containerID="9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.387897 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24"} err="failed to get container status \"9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\": rpc error: code = NotFound desc = could not find container \"9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\": container with ID starting with 9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.387920 4740 scope.go:117] "RemoveContainer" containerID="19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.388315 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9"} err="failed to get container status \"19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\": rpc error: code = NotFound desc = could not find container \"19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\": container with ID starting with 19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.388340 4740 scope.go:117] "RemoveContainer" containerID="59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.389695 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555"} err="failed to get container status \"59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\": rpc error: code = NotFound desc = could not find container \"59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\": container with ID starting with 59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.389801 4740 scope.go:117] "RemoveContainer" containerID="dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.390326 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb"} err="failed to get container status \"dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\": rpc error: code = NotFound desc = could not find container \"dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\": container with ID starting with dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.390358 4740 scope.go:117] "RemoveContainer" containerID="d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.390679 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25"} err="failed to get container status \"d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\": rpc error: code = NotFound desc = could not find container \"d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\": container with ID starting with d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.390702 4740 scope.go:117] "RemoveContainer" containerID="3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.391985 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa"} err="failed to get container status \"3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\": rpc error: code = NotFound desc = could not find container \"3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\": container with ID starting with 3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.392022 4740 scope.go:117] "RemoveContainer" containerID="04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.392381 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065"} err="failed to get container status \"04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\": rpc error: code = NotFound desc = could not find container \"04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\": container with ID starting with 04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.392405 4740 scope.go:117] "RemoveContainer" containerID="e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.392673 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32"} err="failed to get container status \"e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32\": rpc error: code = NotFound desc = could not find container \"e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32\": container with ID starting with e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.392748 4740 scope.go:117] "RemoveContainer" containerID="f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.393047 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767"} err="failed to get container status \"f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767\": rpc error: code = NotFound desc = could not find container \"f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767\": container with ID starting with f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.393067 4740 scope.go:117] "RemoveContainer" containerID="5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.393415 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a"} err="failed to get container status \"5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\": rpc error: code = NotFound desc = could not find container \"5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\": container with ID starting with 5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.393445 4740 scope.go:117] "RemoveContainer" containerID="9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.393827 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24"} err="failed to get container status \"9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\": rpc error: code = NotFound desc = could not find container \"9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\": container with ID starting with 9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.393871 4740 scope.go:117] "RemoveContainer" containerID="19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.394222 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9"} err="failed to get container status \"19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\": rpc error: code = NotFound desc = could not find container \"19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\": container with ID starting with 19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.394252 4740 scope.go:117] "RemoveContainer" containerID="59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.394486 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555"} err="failed to get container status \"59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\": rpc error: code = NotFound desc = could not find container \"59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\": container with ID starting with 59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.394507 4740 scope.go:117] "RemoveContainer" containerID="dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.394759 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb"} err="failed to get container status \"dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\": rpc error: code = NotFound desc = could not find container \"dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\": container with ID starting with dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.394793 4740 scope.go:117] "RemoveContainer" containerID="d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.395016 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25"} err="failed to get container status \"d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\": rpc error: code = NotFound desc = could not find container \"d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\": container with ID starting with d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.395037 4740 scope.go:117] "RemoveContainer" containerID="3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.395246 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa"} err="failed to get container status \"3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\": rpc error: code = NotFound desc = could not find container \"3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\": container with ID starting with 3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.395265 4740 scope.go:117] "RemoveContainer" containerID="04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.395471 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065"} err="failed to get container status \"04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\": rpc error: code = NotFound desc = could not find container \"04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\": container with ID starting with 04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.395494 4740 scope.go:117] "RemoveContainer" containerID="e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.395675 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32"} err="failed to get container status \"e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32\": rpc error: code = NotFound desc = could not find container \"e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32\": container with ID starting with e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.395693 4740 scope.go:117] "RemoveContainer" containerID="f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.395929 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767"} err="failed to get container status \"f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767\": rpc error: code = NotFound desc = could not find container \"f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767\": container with ID starting with f9f62608b34a7a1df32095ab06a85d044a817225231cd184010c4add85977767 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.395950 4740 scope.go:117] "RemoveContainer" containerID="5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.396117 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a"} err="failed to get container status \"5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\": rpc error: code = NotFound desc = could not find container \"5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a\": container with ID starting with 5ad9dd1e3ca60fa1eb35d04cd799a741390e4cfab5a8fe6f3a7fc929727ecc6a not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.396138 4740 scope.go:117] "RemoveContainer" containerID="9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.396370 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24"} err="failed to get container status \"9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\": rpc error: code = NotFound desc = could not find container \"9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24\": container with ID starting with 9ea475853f6c3f5efea91dc69378d435ff2f7b83336f9d281f26fd3558019c24 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.396415 4740 scope.go:117] "RemoveContainer" containerID="19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.396726 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9"} err="failed to get container status \"19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\": rpc error: code = NotFound desc = could not find container \"19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9\": container with ID starting with 19fcb20a6931801e75167a61fbd8b893a35785d89041d216bc2446db8d4e9ed9 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.396747 4740 scope.go:117] "RemoveContainer" containerID="59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.397092 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555"} err="failed to get container status \"59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\": rpc error: code = NotFound desc = could not find container \"59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555\": container with ID starting with 59d3370a8906a5fbe318cb0e84d239441b298ce7fb7f32657c99d71b9f7cb555 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.397131 4740 scope.go:117] "RemoveContainer" containerID="dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.397468 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb"} err="failed to get container status \"dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\": rpc error: code = NotFound desc = could not find container \"dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb\": container with ID starting with dd1ec531ceb0dec24c5745a10a97afb158249fe12d53eb55043afcf20f7989eb not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.397488 4740 scope.go:117] "RemoveContainer" containerID="d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.397761 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25"} err="failed to get container status \"d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\": rpc error: code = NotFound desc = could not find container \"d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25\": container with ID starting with d10610787b65ee6fd6a223f1818f639ec259d52ab1bf87fef721fa37ec866f25 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.397788 4740 scope.go:117] "RemoveContainer" containerID="3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.398222 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa"} err="failed to get container status \"3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\": rpc error: code = NotFound desc = could not find container \"3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa\": container with ID starting with 3a11b411cd89a29933cd1fb5ff18284ad04c4ba47a34da329253cc291a2671fa not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.398241 4740 scope.go:117] "RemoveContainer" containerID="04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.398528 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065"} err="failed to get container status \"04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\": rpc error: code = NotFound desc = could not find container \"04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065\": container with ID starting with 04ed9f77109c300b6b2345e8c2b85f2370f9ea37f5bd1ce53173bcce265fd065 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.398568 4740 scope.go:117] "RemoveContainer" containerID="e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.398925 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32"} err="failed to get container status \"e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32\": rpc error: code = NotFound desc = could not find container \"e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32\": container with ID starting with e312cdc1598c0a2dac22526147e91898f15bcbb872ab4e3fa567d5dcdf0e4f32 not found: ID does not exist"
Oct 09 10:38:35 crc kubenswrapper[4740]: I1009 10:38:35.760132 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="192f5d73-ad53-4674-8c35-c72343c6022e" path="/var/lib/kubelet/pods/192f5d73-ad53-4674-8c35-c72343c6022e/volumes"
Oct 09 10:38:36 crc kubenswrapper[4740]: I1009 10:38:36.203864 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" event={"ID":"35d39da8-b79b-45f3-9df0-d3feb89bba5c","Type":"ContainerStarted","Data":"e3ea91734774384d5480c58ae8866f0884ccbcb81a1bcde2800be86675eed784"}
Oct 09 10:38:36 crc kubenswrapper[4740]: I1009 10:38:36.203910 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" event={"ID":"35d39da8-b79b-45f3-9df0-d3feb89bba5c","Type":"ContainerStarted","Data":"4027161f4ab8d98c2709d99b9e9d76046d85144f940fd849b377f46570431367"}
Oct 09 10:38:36 crc kubenswrapper[4740]: I1009 10:38:36.203923 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" event={"ID":"35d39da8-b79b-45f3-9df0-d3feb89bba5c","Type":"ContainerStarted","Data":"33aac2e78e026e041647fb4f385044ab8a33f4a68e5c6c081f875ab3c916178e"}
Oct 09 10:38:36 crc kubenswrapper[4740]: I1009 10:38:36.203932 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" event={"ID":"35d39da8-b79b-45f3-9df0-d3feb89bba5c","Type":"ContainerStarted","Data":"e50fad640c23cbc42d5cc6c34451feb0644a9b845b7931c30186c85107abdc22"}
Oct 09 10:38:36 crc kubenswrapper[4740]: I1009 10:38:36.203941 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" event={"ID":"35d39da8-b79b-45f3-9df0-d3feb89bba5c","Type":"ContainerStarted","Data":"c61c16aa147e050d57a9f7b55a63aa85599dddd26e6196297df9b9fbe58c93bd"}
Oct 09 10:38:36 crc kubenswrapper[4740]: I1009 10:38:36.203952 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" event={"ID":"35d39da8-b79b-45f3-9df0-d3feb89bba5c","Type":"ContainerStarted","Data":"a005e1643893e6a60da117be3c46deedf0986ee1f53fc0d8782e30e1afae86a2"}
Oct 09 10:38:38 crc kubenswrapper[4740]: I1009 10:38:38.221307 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" event={"ID":"35d39da8-b79b-45f3-9df0-d3feb89bba5c","Type":"ContainerStarted","Data":"42cc906607e7644ab2b5b23162996fd33c316a44d93ee626cb0085f479af22d4"}
Oct 09 10:38:41 crc kubenswrapper[4740]: I1009 10:38:41.241362 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" event={"ID":"35d39da8-b79b-45f3-9df0-d3feb89bba5c","Type":"ContainerStarted","Data":"6b106fcf25e4dbd03f02238e1b7a2ffeb17ece88caf15ae219e6529f7364be4e"}
Oct 09 10:38:41 crc kubenswrapper[4740]: I1009 10:38:41.242113 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz"
Oct 09 10:38:41 crc kubenswrapper[4740]: I1009 10:38:41.242143 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz"
Oct 09 10:38:41 crc kubenswrapper[4740]: I1009 10:38:41.242161 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz"
Oct 09 10:38:41 crc kubenswrapper[4740]: I1009 10:38:41.275367 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz" podStartSLOduration=7.275348879 podStartE2EDuration="7.275348879s" podCreationTimestamp="2025-10-09 10:38:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:38:41.270279628 +0000 UTC m=+660.232480019" watchObservedRunningTime="2025-10-09 10:38:41.275348879 +0000 UTC m=+660.237549270"
Oct 09 10:38:41 crc kubenswrapper[4740]: I1009 10:38:41.281079 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz"
Oct 09 10:38:41 crc kubenswrapper[4740]: I1009 10:38:41.336120 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz"
Oct 09 10:38:42 crc kubenswrapper[4740]: I1009 10:38:42.036598 4740 scope.go:117] "RemoveContainer" containerID="5ed60b7e9b987350e5bfa5f576c1b11d0e02fa7c1adba23203dbfb327ce4f518"
Oct 09 10:38:42 crc kubenswrapper[4740]: I1009 10:38:42.249401 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qrhgt_73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c/kube-multus/2.log"
Oct 09 10:38:49 crc kubenswrapper[4740]: I1009 10:38:49.754294 4740 scope.go:117] "RemoveContainer" containerID="291dfda6e2a2a98625a59d8fb1e8a1e9ca87c6d5b3650d8087ca2d28c0ae233c"
Oct 09 10:38:49 crc kubenswrapper[4740]: E1009 10:38:49.755212 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-qrhgt_openshift-multus(73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c)\"" pod="openshift-multus/multus-qrhgt" podUID="73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c"
Oct 09 10:39:04 crc kubenswrapper[4740]: I1009 10:39:04.753562 4740 scope.go:117] "RemoveContainer" containerID="291dfda6e2a2a98625a59d8fb1e8a1e9ca87c6d5b3650d8087ca2d28c0ae233c"
Oct 09 10:39:04 crc kubenswrapper[4740]: I1009 10:39:04.932573 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jkjkz"
Oct 09 10:39:05 crc kubenswrapper[4740]: I1009 10:39:05.389348 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qrhgt_73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c/kube-multus/2.log"
Oct 09 10:39:05 crc kubenswrapper[4740]: I1009 10:39:05.389815 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qrhgt" event={"ID":"73e2f602-0e1d-46df-9b13-6bc0ebaf9f0c","Type":"ContainerStarted","Data":"73589893be8934d5b5916fe73b18cc732e2cc1c48d21c10f455dc04b3266c0ac"}
Oct 09 10:39:14 crc kubenswrapper[4740]: I1009 10:39:14.681604 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh"]
Oct 09 10:39:14 crc kubenswrapper[4740]: I1009 10:39:14.683951 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh"
Oct 09 10:39:14 crc kubenswrapper[4740]: I1009 10:39:14.692063 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh"]
Oct 09 10:39:14 crc kubenswrapper[4740]: I1009 10:39:14.693893 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Oct 09 10:39:14 crc kubenswrapper[4740]: I1009 10:39:14.825452 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5648ded7-a244-4850-ba02-14aa59ec31f1-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh\" (UID: \"5648ded7-a244-4850-ba02-14aa59ec31f1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh"
Oct 09 10:39:14 crc kubenswrapper[4740]: I1009 10:39:14.825519 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5648ded7-a244-4850-ba02-14aa59ec31f1-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh\" (UID: \"5648ded7-a244-4850-ba02-14aa59ec31f1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh"
Oct 09 10:39:14 crc kubenswrapper[4740]: I1009 10:39:14.825568 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz62b\" (UniqueName: \"kubernetes.io/projected/5648ded7-a244-4850-ba02-14aa59ec31f1-kube-api-access-vz62b\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh\" (UID: \"5648ded7-a244-4850-ba02-14aa59ec31f1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh"
Oct 09 10:39:14 crc kubenswrapper[4740]: I1009 10:39:14.926711 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5648ded7-a244-4850-ba02-14aa59ec31f1-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh\" (UID: \"5648ded7-a244-4850-ba02-14aa59ec31f1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh"
Oct 09 10:39:14 crc kubenswrapper[4740]: I1009 10:39:14.926777 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5648ded7-a244-4850-ba02-14aa59ec31f1-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh\" (UID: \"5648ded7-a244-4850-ba02-14aa59ec31f1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh"
Oct 09 10:39:14 crc kubenswrapper[4740]: I1009 10:39:14.926845 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz62b\" (UniqueName: \"kubernetes.io/projected/5648ded7-a244-4850-ba02-14aa59ec31f1-kube-api-access-vz62b\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh\" (UID: \"5648ded7-a244-4850-ba02-14aa59ec31f1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh"
Oct 09 10:39:14 crc kubenswrapper[4740]: I1009 10:39:14.927664 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5648ded7-a244-4850-ba02-14aa59ec31f1-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh\" (UID: \"5648ded7-a244-4850-ba02-14aa59ec31f1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh"
Oct 09 10:39:14 crc kubenswrapper[4740]: I1009 10:39:14.927855 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5648ded7-a244-4850-ba02-14aa59ec31f1-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh\" (UID: \"5648ded7-a244-4850-ba02-14aa59ec31f1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh"
Oct 09 10:39:14 crc kubenswrapper[4740]: I1009 10:39:14.947564 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz62b\" (UniqueName: \"kubernetes.io/projected/5648ded7-a244-4850-ba02-14aa59ec31f1-kube-api-access-vz62b\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh\" (UID: \"5648ded7-a244-4850-ba02-14aa59ec31f1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh"
Oct 09 10:39:15 crc kubenswrapper[4740]: I1009 10:39:15.004574 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh"
Oct 09 10:39:15 crc kubenswrapper[4740]: I1009 10:39:15.171386 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh"]
Oct 09 10:39:15 crc kubenswrapper[4740]: I1009 10:39:15.448688 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh" event={"ID":"5648ded7-a244-4850-ba02-14aa59ec31f1","Type":"ContainerStarted","Data":"1a837151a361cbdde4f299714e1fd8d5aeca976e187b39255c224321bd7c9a1f"}
Oct 09 10:39:15 crc kubenswrapper[4740]: I1009 10:39:15.448736 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh" event={"ID":"5648ded7-a244-4850-ba02-14aa59ec31f1","Type":"ContainerStarted","Data":"6f6909857e2466e9bf013802f705d2d9a98ba0f4cc0fb871170dea43fbdde615"}
Oct 09 10:39:16 crc kubenswrapper[4740]: I1009 10:39:16.456734 4740 generic.go:334] "Generic (PLEG): container finished" podID="5648ded7-a244-4850-ba02-14aa59ec31f1" containerID="1a837151a361cbdde4f299714e1fd8d5aeca976e187b39255c224321bd7c9a1f" exitCode=0
Oct 09 10:39:16 crc kubenswrapper[4740]: I1009 10:39:16.456847 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh" event={"ID":"5648ded7-a244-4850-ba02-14aa59ec31f1","Type":"ContainerDied","Data":"1a837151a361cbdde4f299714e1fd8d5aeca976e187b39255c224321bd7c9a1f"}
Oct 09 10:39:18 crc kubenswrapper[4740]: I1009 10:39:18.471531 4740 generic.go:334] "Generic (PLEG): container finished" podID="5648ded7-a244-4850-ba02-14aa59ec31f1" containerID="a0a9c79640f87f2bc62f3a8d88568cb118b226cebdeea7a435149643774e3d1e" exitCode=0
Oct 09 10:39:18 crc kubenswrapper[4740]: I1009 10:39:18.471638 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh" event={"ID":"5648ded7-a244-4850-ba02-14aa59ec31f1","Type":"ContainerDied","Data":"a0a9c79640f87f2bc62f3a8d88568cb118b226cebdeea7a435149643774e3d1e"}
Oct 09 10:39:19 crc kubenswrapper[4740]: I1009 10:39:19.481130 4740 generic.go:334] "Generic (PLEG): container finished" podID="5648ded7-a244-4850-ba02-14aa59ec31f1" containerID="43649e8a12a12cc45a5b165bed1166be1b11800a622c7a744453a92110a372cd" exitCode=0
Oct 09 10:39:19 crc kubenswrapper[4740]: I1009 10:39:19.481487 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh" event={"ID":"5648ded7-a244-4850-ba02-14aa59ec31f1","Type":"ContainerDied","Data":"43649e8a12a12cc45a5b165bed1166be1b11800a622c7a744453a92110a372cd"}
Oct 09 10:39:20 crc kubenswrapper[4740]: I1009 10:39:20.746782 4740 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh" Oct 09 10:39:20 crc kubenswrapper[4740]: I1009 10:39:20.906064 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5648ded7-a244-4850-ba02-14aa59ec31f1-bundle\") pod \"5648ded7-a244-4850-ba02-14aa59ec31f1\" (UID: \"5648ded7-a244-4850-ba02-14aa59ec31f1\") " Oct 09 10:39:20 crc kubenswrapper[4740]: I1009 10:39:20.906128 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5648ded7-a244-4850-ba02-14aa59ec31f1-util\") pod \"5648ded7-a244-4850-ba02-14aa59ec31f1\" (UID: \"5648ded7-a244-4850-ba02-14aa59ec31f1\") " Oct 09 10:39:20 crc kubenswrapper[4740]: I1009 10:39:20.906168 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz62b\" (UniqueName: \"kubernetes.io/projected/5648ded7-a244-4850-ba02-14aa59ec31f1-kube-api-access-vz62b\") pod \"5648ded7-a244-4850-ba02-14aa59ec31f1\" (UID: \"5648ded7-a244-4850-ba02-14aa59ec31f1\") " Oct 09 10:39:20 crc kubenswrapper[4740]: I1009 10:39:20.907061 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5648ded7-a244-4850-ba02-14aa59ec31f1-bundle" (OuterVolumeSpecName: "bundle") pod "5648ded7-a244-4850-ba02-14aa59ec31f1" (UID: "5648ded7-a244-4850-ba02-14aa59ec31f1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:39:20 crc kubenswrapper[4740]: I1009 10:39:20.911882 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5648ded7-a244-4850-ba02-14aa59ec31f1-kube-api-access-vz62b" (OuterVolumeSpecName: "kube-api-access-vz62b") pod "5648ded7-a244-4850-ba02-14aa59ec31f1" (UID: "5648ded7-a244-4850-ba02-14aa59ec31f1"). InnerVolumeSpecName "kube-api-access-vz62b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:39:20 crc kubenswrapper[4740]: I1009 10:39:20.979723 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5648ded7-a244-4850-ba02-14aa59ec31f1-util" (OuterVolumeSpecName: "util") pod "5648ded7-a244-4850-ba02-14aa59ec31f1" (UID: "5648ded7-a244-4850-ba02-14aa59ec31f1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:39:21 crc kubenswrapper[4740]: I1009 10:39:21.008398 4740 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5648ded7-a244-4850-ba02-14aa59ec31f1-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:39:21 crc kubenswrapper[4740]: I1009 10:39:21.008442 4740 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5648ded7-a244-4850-ba02-14aa59ec31f1-util\") on node \"crc\" DevicePath \"\"" Oct 09 10:39:21 crc kubenswrapper[4740]: I1009 10:39:21.008452 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz62b\" (UniqueName: \"kubernetes.io/projected/5648ded7-a244-4850-ba02-14aa59ec31f1-kube-api-access-vz62b\") on node \"crc\" DevicePath \"\"" Oct 09 10:39:21 crc kubenswrapper[4740]: I1009 10:39:21.494988 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh" event={"ID":"5648ded7-a244-4850-ba02-14aa59ec31f1","Type":"ContainerDied","Data":"6f6909857e2466e9bf013802f705d2d9a98ba0f4cc0fb871170dea43fbdde615"} Oct 09 10:39:21 crc kubenswrapper[4740]: I1009 10:39:21.495042 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f6909857e2466e9bf013802f705d2d9a98ba0f4cc0fb871170dea43fbdde615" Oct 09 10:39:21 crc kubenswrapper[4740]: I1009 10:39:21.495086 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh" Oct 09 10:39:26 crc kubenswrapper[4740]: I1009 10:39:26.234498 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-8f6tc"] Oct 09 10:39:26 crc kubenswrapper[4740]: E1009 10:39:26.235089 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5648ded7-a244-4850-ba02-14aa59ec31f1" containerName="extract" Oct 09 10:39:26 crc kubenswrapper[4740]: I1009 10:39:26.235110 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5648ded7-a244-4850-ba02-14aa59ec31f1" containerName="extract" Oct 09 10:39:26 crc kubenswrapper[4740]: E1009 10:39:26.235131 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5648ded7-a244-4850-ba02-14aa59ec31f1" containerName="pull" Oct 09 10:39:26 crc kubenswrapper[4740]: I1009 10:39:26.235142 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5648ded7-a244-4850-ba02-14aa59ec31f1" containerName="pull" Oct 09 10:39:26 crc kubenswrapper[4740]: E1009 10:39:26.235168 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5648ded7-a244-4850-ba02-14aa59ec31f1" containerName="util" Oct 09 10:39:26 crc kubenswrapper[4740]: I1009 10:39:26.235181 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5648ded7-a244-4850-ba02-14aa59ec31f1" containerName="util" Oct 09 10:39:26 crc kubenswrapper[4740]: I1009 10:39:26.235340 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="5648ded7-a244-4850-ba02-14aa59ec31f1" containerName="extract" Oct 09 10:39:26 crc kubenswrapper[4740]: I1009 10:39:26.235881 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-8f6tc" Oct 09 10:39:26 crc kubenswrapper[4740]: I1009 10:39:26.238673 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-ssqd9" Oct 09 10:39:26 crc kubenswrapper[4740]: I1009 10:39:26.238771 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 09 10:39:26 crc kubenswrapper[4740]: I1009 10:39:26.238931 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 09 10:39:26 crc kubenswrapper[4740]: I1009 10:39:26.248380 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-8f6tc"] Oct 09 10:39:26 crc kubenswrapper[4740]: I1009 10:39:26.375862 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5z4q\" (UniqueName: \"kubernetes.io/projected/a286b66d-1660-424c-b244-d889a099262c-kube-api-access-h5z4q\") pod \"nmstate-operator-858ddd8f98-8f6tc\" (UID: \"a286b66d-1660-424c-b244-d889a099262c\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-8f6tc" Oct 09 10:39:26 crc kubenswrapper[4740]: I1009 10:39:26.477255 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5z4q\" (UniqueName: \"kubernetes.io/projected/a286b66d-1660-424c-b244-d889a099262c-kube-api-access-h5z4q\") pod \"nmstate-operator-858ddd8f98-8f6tc\" (UID: \"a286b66d-1660-424c-b244-d889a099262c\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-8f6tc" Oct 09 10:39:26 crc kubenswrapper[4740]: I1009 10:39:26.508035 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5z4q\" (UniqueName: \"kubernetes.io/projected/a286b66d-1660-424c-b244-d889a099262c-kube-api-access-h5z4q\") pod \"nmstate-operator-858ddd8f98-8f6tc\" (UID: 
\"a286b66d-1660-424c-b244-d889a099262c\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-8f6tc" Oct 09 10:39:26 crc kubenswrapper[4740]: I1009 10:39:26.551714 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-8f6tc" Oct 09 10:39:26 crc kubenswrapper[4740]: I1009 10:39:26.970789 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-8f6tc"] Oct 09 10:39:27 crc kubenswrapper[4740]: I1009 10:39:27.531024 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-8f6tc" event={"ID":"a286b66d-1660-424c-b244-d889a099262c","Type":"ContainerStarted","Data":"b42852eab6581ec1dd4c90080f9fd99d962a32b7198a3c10e8db54cf0f5564f7"} Oct 09 10:39:29 crc kubenswrapper[4740]: I1009 10:39:29.542196 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-8f6tc" event={"ID":"a286b66d-1660-424c-b244-d889a099262c","Type":"ContainerStarted","Data":"0fccf7b9d0773a36f0e045351a21d6874d6fc2eb0b449b1fcf50e07669ca0afb"} Oct 09 10:39:29 crc kubenswrapper[4740]: I1009 10:39:29.569164 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-8f6tc" podStartSLOduration=1.5984772139999999 podStartE2EDuration="3.569146155s" podCreationTimestamp="2025-10-09 10:39:26 +0000 UTC" firstStartedPulling="2025-10-09 10:39:26.98220899 +0000 UTC m=+705.944409371" lastFinishedPulling="2025-10-09 10:39:28.952877931 +0000 UTC m=+707.915078312" observedRunningTime="2025-10-09 10:39:29.564207317 +0000 UTC m=+708.526407708" watchObservedRunningTime="2025-10-09 10:39:29.569146155 +0000 UTC m=+708.531346536" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.044504 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-fvr8l"] Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 
10:39:35.046123 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-fvr8l" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.050335 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-qbzxs" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.057835 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-bslx4"] Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.058540 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bslx4" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.060615 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-fvr8l"] Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.062093 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.069238 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-bslx4"] Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.084509 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-f9gt2"] Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.085272 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-f9gt2" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.177313 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-5x42v"] Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.177969 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5x42v" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.179339 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.179764 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.186782 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d-ovs-socket\") pod \"nmstate-handler-f9gt2\" (UID: \"9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d\") " pod="openshift-nmstate/nmstate-handler-f9gt2" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.186822 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc68d\" (UniqueName: \"kubernetes.io/projected/9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d-kube-api-access-jc68d\") pod \"nmstate-handler-f9gt2\" (UID: \"9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d\") " pod="openshift-nmstate/nmstate-handler-f9gt2" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.186852 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1f9567a2-9e5d-4996-a625-1bcaca30d9a9-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-bslx4\" (UID: \"1f9567a2-9e5d-4996-a625-1bcaca30d9a9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bslx4" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.186882 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhl9w\" (UniqueName: \"kubernetes.io/projected/eb19fbda-c268-4dce-9ef4-e2b69aaa8dfd-kube-api-access-xhl9w\") pod \"nmstate-metrics-fdff9cb8d-fvr8l\" (UID: 
\"eb19fbda-c268-4dce-9ef4-e2b69aaa8dfd\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-fvr8l" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.186913 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-pxf5r" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.187145 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d-nmstate-lock\") pod \"nmstate-handler-f9gt2\" (UID: \"9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d\") " pod="openshift-nmstate/nmstate-handler-f9gt2" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.187230 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d-dbus-socket\") pod \"nmstate-handler-f9gt2\" (UID: \"9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d\") " pod="openshift-nmstate/nmstate-handler-f9gt2" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.187270 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2j6h\" (UniqueName: \"kubernetes.io/projected/1f9567a2-9e5d-4996-a625-1bcaca30d9a9-kube-api-access-m2j6h\") pod \"nmstate-webhook-6cdbc54649-bslx4\" (UID: \"1f9567a2-9e5d-4996-a625-1bcaca30d9a9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bslx4" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.206090 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-5x42v"] Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.290719 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc68d\" (UniqueName: \"kubernetes.io/projected/9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d-kube-api-access-jc68d\") pod \"nmstate-handler-f9gt2\" (UID: 
\"9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d\") " pod="openshift-nmstate/nmstate-handler-f9gt2" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.290785 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmhwb\" (UniqueName: \"kubernetes.io/projected/edfeadfc-4f2b-4004-9a2d-98b6b8bbe448-kube-api-access-rmhwb\") pod \"nmstate-console-plugin-6b874cbd85-5x42v\" (UID: \"edfeadfc-4f2b-4004-9a2d-98b6b8bbe448\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5x42v" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.290816 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1f9567a2-9e5d-4996-a625-1bcaca30d9a9-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-bslx4\" (UID: \"1f9567a2-9e5d-4996-a625-1bcaca30d9a9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bslx4" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.290845 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhl9w\" (UniqueName: \"kubernetes.io/projected/eb19fbda-c268-4dce-9ef4-e2b69aaa8dfd-kube-api-access-xhl9w\") pod \"nmstate-metrics-fdff9cb8d-fvr8l\" (UID: \"eb19fbda-c268-4dce-9ef4-e2b69aaa8dfd\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-fvr8l" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.290864 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d-nmstate-lock\") pod \"nmstate-handler-f9gt2\" (UID: \"9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d\") " pod="openshift-nmstate/nmstate-handler-f9gt2" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.290887 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d-dbus-socket\") pod 
\"nmstate-handler-f9gt2\" (UID: \"9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d\") " pod="openshift-nmstate/nmstate-handler-f9gt2" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.290903 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2j6h\" (UniqueName: \"kubernetes.io/projected/1f9567a2-9e5d-4996-a625-1bcaca30d9a9-kube-api-access-m2j6h\") pod \"nmstate-webhook-6cdbc54649-bslx4\" (UID: \"1f9567a2-9e5d-4996-a625-1bcaca30d9a9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bslx4" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.290931 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/edfeadfc-4f2b-4004-9a2d-98b6b8bbe448-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-5x42v\" (UID: \"edfeadfc-4f2b-4004-9a2d-98b6b8bbe448\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5x42v" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.290955 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/edfeadfc-4f2b-4004-9a2d-98b6b8bbe448-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-5x42v\" (UID: \"edfeadfc-4f2b-4004-9a2d-98b6b8bbe448\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5x42v" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.290980 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d-ovs-socket\") pod \"nmstate-handler-f9gt2\" (UID: \"9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d\") " pod="openshift-nmstate/nmstate-handler-f9gt2" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.290998 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d-nmstate-lock\") pod \"nmstate-handler-f9gt2\" (UID: \"9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d\") " pod="openshift-nmstate/nmstate-handler-f9gt2" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.291030 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d-ovs-socket\") pod \"nmstate-handler-f9gt2\" (UID: \"9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d\") " pod="openshift-nmstate/nmstate-handler-f9gt2" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.291637 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d-dbus-socket\") pod \"nmstate-handler-f9gt2\" (UID: \"9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d\") " pod="openshift-nmstate/nmstate-handler-f9gt2" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.309573 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc68d\" (UniqueName: \"kubernetes.io/projected/9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d-kube-api-access-jc68d\") pod \"nmstate-handler-f9gt2\" (UID: \"9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d\") " pod="openshift-nmstate/nmstate-handler-f9gt2" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.311148 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhl9w\" (UniqueName: \"kubernetes.io/projected/eb19fbda-c268-4dce-9ef4-e2b69aaa8dfd-kube-api-access-xhl9w\") pod \"nmstate-metrics-fdff9cb8d-fvr8l\" (UID: \"eb19fbda-c268-4dce-9ef4-e2b69aaa8dfd\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-fvr8l" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.315669 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1f9567a2-9e5d-4996-a625-1bcaca30d9a9-tls-key-pair\") pod 
\"nmstate-webhook-6cdbc54649-bslx4\" (UID: \"1f9567a2-9e5d-4996-a625-1bcaca30d9a9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bslx4" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.316137 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2j6h\" (UniqueName: \"kubernetes.io/projected/1f9567a2-9e5d-4996-a625-1bcaca30d9a9-kube-api-access-m2j6h\") pod \"nmstate-webhook-6cdbc54649-bslx4\" (UID: \"1f9567a2-9e5d-4996-a625-1bcaca30d9a9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bslx4" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.358235 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-649484bbdd-wv5jq"] Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.358896 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.363537 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-fvr8l" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.374230 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-649484bbdd-wv5jq"] Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.377192 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bslx4" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.392897 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmhwb\" (UniqueName: \"kubernetes.io/projected/edfeadfc-4f2b-4004-9a2d-98b6b8bbe448-kube-api-access-rmhwb\") pod \"nmstate-console-plugin-6b874cbd85-5x42v\" (UID: \"edfeadfc-4f2b-4004-9a2d-98b6b8bbe448\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5x42v" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.393261 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/edfeadfc-4f2b-4004-9a2d-98b6b8bbe448-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-5x42v\" (UID: \"edfeadfc-4f2b-4004-9a2d-98b6b8bbe448\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5x42v" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.393298 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/edfeadfc-4f2b-4004-9a2d-98b6b8bbe448-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-5x42v\" (UID: \"edfeadfc-4f2b-4004-9a2d-98b6b8bbe448\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5x42v" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.394492 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/edfeadfc-4f2b-4004-9a2d-98b6b8bbe448-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-5x42v\" (UID: \"edfeadfc-4f2b-4004-9a2d-98b6b8bbe448\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5x42v" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.398596 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/edfeadfc-4f2b-4004-9a2d-98b6b8bbe448-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-5x42v\" (UID: \"edfeadfc-4f2b-4004-9a2d-98b6b8bbe448\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5x42v" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.399082 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-f9gt2" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.407599 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.407651 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.427743 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmhwb\" (UniqueName: \"kubernetes.io/projected/edfeadfc-4f2b-4004-9a2d-98b6b8bbe448-kube-api-access-rmhwb\") pod \"nmstate-console-plugin-6b874cbd85-5x42v\" (UID: \"edfeadfc-4f2b-4004-9a2d-98b6b8bbe448\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5x42v" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.495526 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh7dv\" (UniqueName: \"kubernetes.io/projected/d7b49f5d-65c1-4989-9dbe-f0bbe5a90263-kube-api-access-gh7dv\") pod \"console-649484bbdd-wv5jq\" (UID: \"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263\") " 
pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.495568 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d7b49f5d-65c1-4989-9dbe-f0bbe5a90263-oauth-serving-cert\") pod \"console-649484bbdd-wv5jq\" (UID: \"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263\") " pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.495585 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d7b49f5d-65c1-4989-9dbe-f0bbe5a90263-console-config\") pod \"console-649484bbdd-wv5jq\" (UID: \"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263\") " pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.495604 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7b49f5d-65c1-4989-9dbe-f0bbe5a90263-console-serving-cert\") pod \"console-649484bbdd-wv5jq\" (UID: \"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263\") " pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.495619 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7b49f5d-65c1-4989-9dbe-f0bbe5a90263-service-ca\") pod \"console-649484bbdd-wv5jq\" (UID: \"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263\") " pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.495652 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d7b49f5d-65c1-4989-9dbe-f0bbe5a90263-console-oauth-config\") pod 
\"console-649484bbdd-wv5jq\" (UID: \"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263\") " pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.495682 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7b49f5d-65c1-4989-9dbe-f0bbe5a90263-trusted-ca-bundle\") pod \"console-649484bbdd-wv5jq\" (UID: \"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263\") " pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.506420 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5x42v" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.575582 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-bslx4"] Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.585790 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-f9gt2" event={"ID":"9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d","Type":"ContainerStarted","Data":"9f8502329f371ae05e9e362797547351db17f3fdc0b30d46cda4fdeaf12ccf9b"} Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.599337 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7b49f5d-65c1-4989-9dbe-f0bbe5a90263-trusted-ca-bundle\") pod \"console-649484bbdd-wv5jq\" (UID: \"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263\") " pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.599413 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh7dv\" (UniqueName: \"kubernetes.io/projected/d7b49f5d-65c1-4989-9dbe-f0bbe5a90263-kube-api-access-gh7dv\") pod \"console-649484bbdd-wv5jq\" (UID: \"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263\") " 
pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.599434 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d7b49f5d-65c1-4989-9dbe-f0bbe5a90263-oauth-serving-cert\") pod \"console-649484bbdd-wv5jq\" (UID: \"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263\") " pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.599450 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d7b49f5d-65c1-4989-9dbe-f0bbe5a90263-console-config\") pod \"console-649484bbdd-wv5jq\" (UID: \"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263\") " pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.599475 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7b49f5d-65c1-4989-9dbe-f0bbe5a90263-console-serving-cert\") pod \"console-649484bbdd-wv5jq\" (UID: \"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263\") " pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.599492 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7b49f5d-65c1-4989-9dbe-f0bbe5a90263-service-ca\") pod \"console-649484bbdd-wv5jq\" (UID: \"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263\") " pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.599534 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d7b49f5d-65c1-4989-9dbe-f0bbe5a90263-console-oauth-config\") pod \"console-649484bbdd-wv5jq\" (UID: \"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263\") " 
pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.600565 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d7b49f5d-65c1-4989-9dbe-f0bbe5a90263-oauth-serving-cert\") pod \"console-649484bbdd-wv5jq\" (UID: \"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263\") " pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.601359 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7b49f5d-65c1-4989-9dbe-f0bbe5a90263-service-ca\") pod \"console-649484bbdd-wv5jq\" (UID: \"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263\") " pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.601381 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d7b49f5d-65c1-4989-9dbe-f0bbe5a90263-console-config\") pod \"console-649484bbdd-wv5jq\" (UID: \"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263\") " pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.602133 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7b49f5d-65c1-4989-9dbe-f0bbe5a90263-trusted-ca-bundle\") pod \"console-649484bbdd-wv5jq\" (UID: \"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263\") " pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.604697 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7b49f5d-65c1-4989-9dbe-f0bbe5a90263-console-serving-cert\") pod \"console-649484bbdd-wv5jq\" (UID: \"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263\") " pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc 
kubenswrapper[4740]: I1009 10:39:35.608369 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d7b49f5d-65c1-4989-9dbe-f0bbe5a90263-console-oauth-config\") pod \"console-649484bbdd-wv5jq\" (UID: \"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263\") " pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.615921 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-fvr8l"] Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.620481 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh7dv\" (UniqueName: \"kubernetes.io/projected/d7b49f5d-65c1-4989-9dbe-f0bbe5a90263-kube-api-access-gh7dv\") pod \"console-649484bbdd-wv5jq\" (UID: \"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263\") " pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: W1009 10:39:35.702187 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedfeadfc_4f2b_4004_9a2d_98b6b8bbe448.slice/crio-c001fd3931ee6acc205cc34615741d7bd57e19f85ce2aa96ff567f52c9217635 WatchSource:0}: Error finding container c001fd3931ee6acc205cc34615741d7bd57e19f85ce2aa96ff567f52c9217635: Status 404 returned error can't find the container with id c001fd3931ee6acc205cc34615741d7bd57e19f85ce2aa96ff567f52c9217635 Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.702536 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-5x42v"] Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.747436 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:35 crc kubenswrapper[4740]: I1009 10:39:35.952008 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-649484bbdd-wv5jq"] Oct 09 10:39:35 crc kubenswrapper[4740]: W1009 10:39:35.958605 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7b49f5d_65c1_4989_9dbe_f0bbe5a90263.slice/crio-1ba15205412ba50770bdfd9e3c4f18960d9ef771345b77d555fbf05deb02518d WatchSource:0}: Error finding container 1ba15205412ba50770bdfd9e3c4f18960d9ef771345b77d555fbf05deb02518d: Status 404 returned error can't find the container with id 1ba15205412ba50770bdfd9e3c4f18960d9ef771345b77d555fbf05deb02518d Oct 09 10:39:36 crc kubenswrapper[4740]: I1009 10:39:36.593802 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5x42v" event={"ID":"edfeadfc-4f2b-4004-9a2d-98b6b8bbe448","Type":"ContainerStarted","Data":"c001fd3931ee6acc205cc34615741d7bd57e19f85ce2aa96ff567f52c9217635"} Oct 09 10:39:36 crc kubenswrapper[4740]: I1009 10:39:36.596701 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-649484bbdd-wv5jq" event={"ID":"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263","Type":"ContainerStarted","Data":"bba886e43794813766d71bdf3ca207bb91fcd1d4a8dd2647fd32a46537749931"} Oct 09 10:39:36 crc kubenswrapper[4740]: I1009 10:39:36.596791 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-649484bbdd-wv5jq" event={"ID":"d7b49f5d-65c1-4989-9dbe-f0bbe5a90263","Type":"ContainerStarted","Data":"1ba15205412ba50770bdfd9e3c4f18960d9ef771345b77d555fbf05deb02518d"} Oct 09 10:39:36 crc kubenswrapper[4740]: I1009 10:39:36.598354 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bslx4" 
event={"ID":"1f9567a2-9e5d-4996-a625-1bcaca30d9a9","Type":"ContainerStarted","Data":"f3a0320f8ac102337807f893781c9a874f4074e3a5c11d486810050805ea10f7"} Oct 09 10:39:36 crc kubenswrapper[4740]: I1009 10:39:36.599700 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-fvr8l" event={"ID":"eb19fbda-c268-4dce-9ef4-e2b69aaa8dfd","Type":"ContainerStarted","Data":"4687d4ac54853c6b152ab60b0026ea26deef7b44a68026342cbd55b37f0de506"} Oct 09 10:39:36 crc kubenswrapper[4740]: I1009 10:39:36.629160 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-649484bbdd-wv5jq" podStartSLOduration=1.6291332440000001 podStartE2EDuration="1.629133244s" podCreationTimestamp="2025-10-09 10:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:39:36.620650118 +0000 UTC m=+715.582850539" watchObservedRunningTime="2025-10-09 10:39:36.629133244 +0000 UTC m=+715.591333665" Oct 09 10:39:39 crc kubenswrapper[4740]: I1009 10:39:39.619235 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-fvr8l" event={"ID":"eb19fbda-c268-4dce-9ef4-e2b69aaa8dfd","Type":"ContainerStarted","Data":"c2da30595ed83dbe8a9c58cb5ad0e94993467f8a2bc57071b1370ade9e866e0d"} Oct 09 10:39:39 crc kubenswrapper[4740]: I1009 10:39:39.621974 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-f9gt2" event={"ID":"9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d","Type":"ContainerStarted","Data":"9f3ba8894a38f970cdda2618e80e1d35bacb1c833e53505d7acf88c0064f47f0"} Oct 09 10:39:39 crc kubenswrapper[4740]: I1009 10:39:39.622180 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-f9gt2" Oct 09 10:39:39 crc kubenswrapper[4740]: I1009 10:39:39.624418 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5x42v" event={"ID":"edfeadfc-4f2b-4004-9a2d-98b6b8bbe448","Type":"ContainerStarted","Data":"d95a11d1e9b3e7e45646f306b8e5b5d819e603b4a78d5d0148648e442e01b02a"} Oct 09 10:39:39 crc kubenswrapper[4740]: I1009 10:39:39.627867 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bslx4" event={"ID":"1f9567a2-9e5d-4996-a625-1bcaca30d9a9","Type":"ContainerStarted","Data":"595e29359492f26b3ca9455b248e521ea7f9377cb60d66c1b38fd6775d4f0771"} Oct 09 10:39:39 crc kubenswrapper[4740]: I1009 10:39:39.628099 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bslx4" Oct 09 10:39:39 crc kubenswrapper[4740]: I1009 10:39:39.648516 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-f9gt2" podStartSLOduration=1.215590395 podStartE2EDuration="4.648481702s" podCreationTimestamp="2025-10-09 10:39:35 +0000 UTC" firstStartedPulling="2025-10-09 10:39:35.43863552 +0000 UTC m=+714.400835891" lastFinishedPulling="2025-10-09 10:39:38.871526817 +0000 UTC m=+717.833727198" observedRunningTime="2025-10-09 10:39:39.644428099 +0000 UTC m=+718.606628500" watchObservedRunningTime="2025-10-09 10:39:39.648481702 +0000 UTC m=+718.610682083" Oct 09 10:39:39 crc kubenswrapper[4740]: I1009 10:39:39.662183 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bslx4" podStartSLOduration=1.40025765 podStartE2EDuration="4.662164423s" podCreationTimestamp="2025-10-09 10:39:35 +0000 UTC" firstStartedPulling="2025-10-09 10:39:35.588090825 +0000 UTC m=+714.550291236" lastFinishedPulling="2025-10-09 10:39:38.849997628 +0000 UTC m=+717.812198009" observedRunningTime="2025-10-09 10:39:39.65990517 +0000 UTC m=+718.622105561" watchObservedRunningTime="2025-10-09 10:39:39.662164423 +0000 UTC m=+718.624364804" Oct 09 
10:39:39 crc kubenswrapper[4740]: I1009 10:39:39.673677 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5x42v" podStartSLOduration=1.6069650659999999 podStartE2EDuration="4.673656222s" podCreationTimestamp="2025-10-09 10:39:35 +0000 UTC" firstStartedPulling="2025-10-09 10:39:35.703858942 +0000 UTC m=+714.666059323" lastFinishedPulling="2025-10-09 10:39:38.770550088 +0000 UTC m=+717.732750479" observedRunningTime="2025-10-09 10:39:39.671874933 +0000 UTC m=+718.634075334" watchObservedRunningTime="2025-10-09 10:39:39.673656222 +0000 UTC m=+718.635856603" Oct 09 10:39:41 crc kubenswrapper[4740]: I1009 10:39:41.648997 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-fvr8l" event={"ID":"eb19fbda-c268-4dce-9ef4-e2b69aaa8dfd","Type":"ContainerStarted","Data":"7a996ed9a102ce990b049368dedc0b5e2244b5f310d9c13c1b755f9f4fbd5896"} Oct 09 10:39:41 crc kubenswrapper[4740]: I1009 10:39:41.664855 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-fvr8l" podStartSLOduration=0.946638422 podStartE2EDuration="6.664837796s" podCreationTimestamp="2025-10-09 10:39:35 +0000 UTC" firstStartedPulling="2025-10-09 10:39:35.627294988 +0000 UTC m=+714.589495369" lastFinishedPulling="2025-10-09 10:39:41.345494352 +0000 UTC m=+720.307694743" observedRunningTime="2025-10-09 10:39:41.663266252 +0000 UTC m=+720.625466643" watchObservedRunningTime="2025-10-09 10:39:41.664837796 +0000 UTC m=+720.627038177" Oct 09 10:39:45 crc kubenswrapper[4740]: I1009 10:39:45.438442 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-f9gt2" Oct 09 10:39:45 crc kubenswrapper[4740]: I1009 10:39:45.748001 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:45 crc 
kubenswrapper[4740]: I1009 10:39:45.748105 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:45 crc kubenswrapper[4740]: I1009 10:39:45.767884 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:46 crc kubenswrapper[4740]: I1009 10:39:46.691663 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-649484bbdd-wv5jq" Oct 09 10:39:46 crc kubenswrapper[4740]: I1009 10:39:46.768164 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-g68sq"] Oct 09 10:39:55 crc kubenswrapper[4740]: I1009 10:39:55.384337 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bslx4" Oct 09 10:40:05 crc kubenswrapper[4740]: I1009 10:40:05.408078 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 10:40:05 crc kubenswrapper[4740]: I1009 10:40:05.408418 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 10:40:07 crc kubenswrapper[4740]: I1009 10:40:07.482383 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x99pn"] Oct 09 10:40:07 crc kubenswrapper[4740]: I1009 10:40:07.483213 4740 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" podUID="63fc8742-b2a1-42a1-b78e-11e736801124" containerName="controller-manager" containerID="cri-o://2bec7c6d0b22b7e8cb9345a07e1b8154d087d5263def2b1522904a168664313b" gracePeriod=30 Oct 09 10:40:07 crc kubenswrapper[4740]: I1009 10:40:07.602185 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g"] Oct 09 10:40:07 crc kubenswrapper[4740]: I1009 10:40:07.602697 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g" podUID="9063b645-eba3-4ba3-a871-23adad70136d" containerName="route-controller-manager" containerID="cri-o://d940a9a65c725e4b4e9e0b418782e958caa57d3b13d92a265a5b84c31ed76a86" gracePeriod=30 Oct 09 10:40:07 crc kubenswrapper[4740]: I1009 10:40:07.830087 4740 generic.go:334] "Generic (PLEG): container finished" podID="9063b645-eba3-4ba3-a871-23adad70136d" containerID="d940a9a65c725e4b4e9e0b418782e958caa57d3b13d92a265a5b84c31ed76a86" exitCode=0 Oct 09 10:40:07 crc kubenswrapper[4740]: I1009 10:40:07.830144 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g" event={"ID":"9063b645-eba3-4ba3-a871-23adad70136d","Type":"ContainerDied","Data":"d940a9a65c725e4b4e9e0b418782e958caa57d3b13d92a265a5b84c31ed76a86"} Oct 09 10:40:07 crc kubenswrapper[4740]: I1009 10:40:07.831516 4740 generic.go:334] "Generic (PLEG): container finished" podID="63fc8742-b2a1-42a1-b78e-11e736801124" containerID="2bec7c6d0b22b7e8cb9345a07e1b8154d087d5263def2b1522904a168664313b" exitCode=0 Oct 09 10:40:07 crc kubenswrapper[4740]: I1009 10:40:07.831538 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" 
event={"ID":"63fc8742-b2a1-42a1-b78e-11e736801124","Type":"ContainerDied","Data":"2bec7c6d0b22b7e8cb9345a07e1b8154d087d5263def2b1522904a168664313b"} Oct 09 10:40:07 crc kubenswrapper[4740]: I1009 10:40:07.937389 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" Oct 09 10:40:07 crc kubenswrapper[4740]: I1009 10:40:07.996460 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.106646 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63fc8742-b2a1-42a1-b78e-11e736801124-client-ca\") pod \"63fc8742-b2a1-42a1-b78e-11e736801124\" (UID: \"63fc8742-b2a1-42a1-b78e-11e736801124\") " Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.106725 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc8742-b2a1-42a1-b78e-11e736801124-config\") pod \"63fc8742-b2a1-42a1-b78e-11e736801124\" (UID: \"63fc8742-b2a1-42a1-b78e-11e736801124\") " Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.106793 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fc8742-b2a1-42a1-b78e-11e736801124-serving-cert\") pod \"63fc8742-b2a1-42a1-b78e-11e736801124\" (UID: \"63fc8742-b2a1-42a1-b78e-11e736801124\") " Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.106829 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hz6d\" (UniqueName: \"kubernetes.io/projected/9063b645-eba3-4ba3-a871-23adad70136d-kube-api-access-2hz6d\") pod \"9063b645-eba3-4ba3-a871-23adad70136d\" (UID: \"9063b645-eba3-4ba3-a871-23adad70136d\") " Oct 09 
10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.106856 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9063b645-eba3-4ba3-a871-23adad70136d-serving-cert\") pod \"9063b645-eba3-4ba3-a871-23adad70136d\" (UID: \"9063b645-eba3-4ba3-a871-23adad70136d\") " Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.106880 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9063b645-eba3-4ba3-a871-23adad70136d-config\") pod \"9063b645-eba3-4ba3-a871-23adad70136d\" (UID: \"9063b645-eba3-4ba3-a871-23adad70136d\") " Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.106902 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9063b645-eba3-4ba3-a871-23adad70136d-client-ca\") pod \"9063b645-eba3-4ba3-a871-23adad70136d\" (UID: \"9063b645-eba3-4ba3-a871-23adad70136d\") " Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.106928 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4hsd\" (UniqueName: \"kubernetes.io/projected/63fc8742-b2a1-42a1-b78e-11e736801124-kube-api-access-b4hsd\") pod \"63fc8742-b2a1-42a1-b78e-11e736801124\" (UID: \"63fc8742-b2a1-42a1-b78e-11e736801124\") " Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.106971 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/63fc8742-b2a1-42a1-b78e-11e736801124-proxy-ca-bundles\") pod \"63fc8742-b2a1-42a1-b78e-11e736801124\" (UID: \"63fc8742-b2a1-42a1-b78e-11e736801124\") " Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.107715 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63fc8742-b2a1-42a1-b78e-11e736801124-client-ca" (OuterVolumeSpecName: "client-ca") pod 
"63fc8742-b2a1-42a1-b78e-11e736801124" (UID: "63fc8742-b2a1-42a1-b78e-11e736801124"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.107726 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63fc8742-b2a1-42a1-b78e-11e736801124-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "63fc8742-b2a1-42a1-b78e-11e736801124" (UID: "63fc8742-b2a1-42a1-b78e-11e736801124"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.108058 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9063b645-eba3-4ba3-a871-23adad70136d-client-ca" (OuterVolumeSpecName: "client-ca") pod "9063b645-eba3-4ba3-a871-23adad70136d" (UID: "9063b645-eba3-4ba3-a871-23adad70136d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.108178 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63fc8742-b2a1-42a1-b78e-11e736801124-config" (OuterVolumeSpecName: "config") pod "63fc8742-b2a1-42a1-b78e-11e736801124" (UID: "63fc8742-b2a1-42a1-b78e-11e736801124"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.108405 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9063b645-eba3-4ba3-a871-23adad70136d-config" (OuterVolumeSpecName: "config") pod "9063b645-eba3-4ba3-a871-23adad70136d" (UID: "9063b645-eba3-4ba3-a871-23adad70136d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.114993 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63fc8742-b2a1-42a1-b78e-11e736801124-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "63fc8742-b2a1-42a1-b78e-11e736801124" (UID: "63fc8742-b2a1-42a1-b78e-11e736801124"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.115015 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63fc8742-b2a1-42a1-b78e-11e736801124-kube-api-access-b4hsd" (OuterVolumeSpecName: "kube-api-access-b4hsd") pod "63fc8742-b2a1-42a1-b78e-11e736801124" (UID: "63fc8742-b2a1-42a1-b78e-11e736801124"). InnerVolumeSpecName "kube-api-access-b4hsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.115033 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9063b645-eba3-4ba3-a871-23adad70136d-kube-api-access-2hz6d" (OuterVolumeSpecName: "kube-api-access-2hz6d") pod "9063b645-eba3-4ba3-a871-23adad70136d" (UID: "9063b645-eba3-4ba3-a871-23adad70136d"). InnerVolumeSpecName "kube-api-access-2hz6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.115409 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9063b645-eba3-4ba3-a871-23adad70136d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9063b645-eba3-4ba3-a871-23adad70136d" (UID: "9063b645-eba3-4ba3-a871-23adad70136d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.208478 4740 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/63fc8742-b2a1-42a1-b78e-11e736801124-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.208510 4740 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63fc8742-b2a1-42a1-b78e-11e736801124-client-ca\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.208521 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc8742-b2a1-42a1-b78e-11e736801124-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.208529 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fc8742-b2a1-42a1-b78e-11e736801124-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.208538 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hz6d\" (UniqueName: \"kubernetes.io/projected/9063b645-eba3-4ba3-a871-23adad70136d-kube-api-access-2hz6d\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.208548 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9063b645-eba3-4ba3-a871-23adad70136d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.208556 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9063b645-eba3-4ba3-a871-23adad70136d-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.208565 4740 reconciler_common.go:293] "Volume 
detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9063b645-eba3-4ba3-a871-23adad70136d-client-ca\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.208573 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4hsd\" (UniqueName: \"kubernetes.io/projected/63fc8742-b2a1-42a1-b78e-11e736801124-kube-api-access-b4hsd\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.840539 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" event={"ID":"63fc8742-b2a1-42a1-b78e-11e736801124","Type":"ContainerDied","Data":"f6b8492ead6520298475cf579470fe39d65f6ab474484dfe8137a206e92c776c"} Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.841009 4740 scope.go:117] "RemoveContainer" containerID="2bec7c6d0b22b7e8cb9345a07e1b8154d087d5263def2b1522904a168664313b" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.840733 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x99pn" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.845556 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g" event={"ID":"9063b645-eba3-4ba3-a871-23adad70136d","Type":"ContainerDied","Data":"7cbf21bcab694f2ed43fbf52d6fa51dcf64d46a36e6f18ac24ad4c949df9e215"} Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.845593 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.887353 4740 scope.go:117] "RemoveContainer" containerID="d940a9a65c725e4b4e9e0b418782e958caa57d3b13d92a265a5b84c31ed76a86" Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.887489 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g"] Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.891453 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5bn7g"] Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.897340 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x99pn"] Oct 09 10:40:08 crc kubenswrapper[4740]: I1009 10:40:08.904483 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x99pn"] Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.369547 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w"] Oct 09 10:40:09 crc kubenswrapper[4740]: E1009 10:40:09.369851 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63fc8742-b2a1-42a1-b78e-11e736801124" containerName="controller-manager" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.369868 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="63fc8742-b2a1-42a1-b78e-11e736801124" containerName="controller-manager" Oct 09 10:40:09 crc kubenswrapper[4740]: E1009 10:40:09.369896 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9063b645-eba3-4ba3-a871-23adad70136d" containerName="route-controller-manager" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.369909 4740 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9063b645-eba3-4ba3-a871-23adad70136d" containerName="route-controller-manager" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.370053 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="9063b645-eba3-4ba3-a871-23adad70136d" containerName="route-controller-manager" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.370080 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="63fc8742-b2a1-42a1-b78e-11e736801124" containerName="controller-manager" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.370519 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.372032 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54859d675c-l58ct"] Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.372077 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.372162 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.372426 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.372693 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.372782 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.373115 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.374139 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.375170 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.380773 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.381225 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.382301 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.382428 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.382533 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.392859 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w"] Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.396389 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 09 10:40:09 crc 
kubenswrapper[4740]: I1009 10:40:09.398843 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54859d675c-l58ct"] Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.424577 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e830ca8b-4596-4ee1-89f0-fcbe99ad85d4-config\") pod \"route-controller-manager-585775dd7f-gfs8w\" (UID: \"e830ca8b-4596-4ee1-89f0-fcbe99ad85d4\") " pod="openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.424812 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08d5d820-0282-4a0c-93b5-bd5892c1a8ed-serving-cert\") pod \"controller-manager-54859d675c-l58ct\" (UID: \"08d5d820-0282-4a0c-93b5-bd5892c1a8ed\") " pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.424855 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08d5d820-0282-4a0c-93b5-bd5892c1a8ed-config\") pod \"controller-manager-54859d675c-l58ct\" (UID: \"08d5d820-0282-4a0c-93b5-bd5892c1a8ed\") " pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.424876 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08d5d820-0282-4a0c-93b5-bd5892c1a8ed-proxy-ca-bundles\") pod \"controller-manager-54859d675c-l58ct\" (UID: \"08d5d820-0282-4a0c-93b5-bd5892c1a8ed\") " pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.424898 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbrqv\" (UniqueName: \"kubernetes.io/projected/08d5d820-0282-4a0c-93b5-bd5892c1a8ed-kube-api-access-xbrqv\") pod \"controller-manager-54859d675c-l58ct\" (UID: \"08d5d820-0282-4a0c-93b5-bd5892c1a8ed\") " pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.424916 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e830ca8b-4596-4ee1-89f0-fcbe99ad85d4-client-ca\") pod \"route-controller-manager-585775dd7f-gfs8w\" (UID: \"e830ca8b-4596-4ee1-89f0-fcbe99ad85d4\") " pod="openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.424930 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtv9m\" (UniqueName: \"kubernetes.io/projected/e830ca8b-4596-4ee1-89f0-fcbe99ad85d4-kube-api-access-gtv9m\") pod \"route-controller-manager-585775dd7f-gfs8w\" (UID: \"e830ca8b-4596-4ee1-89f0-fcbe99ad85d4\") " pod="openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.424948 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e830ca8b-4596-4ee1-89f0-fcbe99ad85d4-serving-cert\") pod \"route-controller-manager-585775dd7f-gfs8w\" (UID: \"e830ca8b-4596-4ee1-89f0-fcbe99ad85d4\") " pod="openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.424971 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/08d5d820-0282-4a0c-93b5-bd5892c1a8ed-client-ca\") pod \"controller-manager-54859d675c-l58ct\" (UID: \"08d5d820-0282-4a0c-93b5-bd5892c1a8ed\") " pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.525485 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08d5d820-0282-4a0c-93b5-bd5892c1a8ed-serving-cert\") pod \"controller-manager-54859d675c-l58ct\" (UID: \"08d5d820-0282-4a0c-93b5-bd5892c1a8ed\") " pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.525579 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08d5d820-0282-4a0c-93b5-bd5892c1a8ed-config\") pod \"controller-manager-54859d675c-l58ct\" (UID: \"08d5d820-0282-4a0c-93b5-bd5892c1a8ed\") " pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.525608 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08d5d820-0282-4a0c-93b5-bd5892c1a8ed-proxy-ca-bundles\") pod \"controller-manager-54859d675c-l58ct\" (UID: \"08d5d820-0282-4a0c-93b5-bd5892c1a8ed\") " pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.525643 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbrqv\" (UniqueName: \"kubernetes.io/projected/08d5d820-0282-4a0c-93b5-bd5892c1a8ed-kube-api-access-xbrqv\") pod \"controller-manager-54859d675c-l58ct\" (UID: \"08d5d820-0282-4a0c-93b5-bd5892c1a8ed\") " pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.525668 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e830ca8b-4596-4ee1-89f0-fcbe99ad85d4-client-ca\") pod \"route-controller-manager-585775dd7f-gfs8w\" (UID: \"e830ca8b-4596-4ee1-89f0-fcbe99ad85d4\") " pod="openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.525692 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtv9m\" (UniqueName: \"kubernetes.io/projected/e830ca8b-4596-4ee1-89f0-fcbe99ad85d4-kube-api-access-gtv9m\") pod \"route-controller-manager-585775dd7f-gfs8w\" (UID: \"e830ca8b-4596-4ee1-89f0-fcbe99ad85d4\") " pod="openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.525715 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e830ca8b-4596-4ee1-89f0-fcbe99ad85d4-serving-cert\") pod \"route-controller-manager-585775dd7f-gfs8w\" (UID: \"e830ca8b-4596-4ee1-89f0-fcbe99ad85d4\") " pod="openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.525744 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08d5d820-0282-4a0c-93b5-bd5892c1a8ed-client-ca\") pod \"controller-manager-54859d675c-l58ct\" (UID: \"08d5d820-0282-4a0c-93b5-bd5892c1a8ed\") " pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.525795 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e830ca8b-4596-4ee1-89f0-fcbe99ad85d4-config\") pod \"route-controller-manager-585775dd7f-gfs8w\" (UID: \"e830ca8b-4596-4ee1-89f0-fcbe99ad85d4\") 
" pod="openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.526664 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e830ca8b-4596-4ee1-89f0-fcbe99ad85d4-client-ca\") pod \"route-controller-manager-585775dd7f-gfs8w\" (UID: \"e830ca8b-4596-4ee1-89f0-fcbe99ad85d4\") " pod="openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.527162 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e830ca8b-4596-4ee1-89f0-fcbe99ad85d4-config\") pod \"route-controller-manager-585775dd7f-gfs8w\" (UID: \"e830ca8b-4596-4ee1-89f0-fcbe99ad85d4\") " pod="openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.528934 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08d5d820-0282-4a0c-93b5-bd5892c1a8ed-config\") pod \"controller-manager-54859d675c-l58ct\" (UID: \"08d5d820-0282-4a0c-93b5-bd5892c1a8ed\") " pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.529072 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08d5d820-0282-4a0c-93b5-bd5892c1a8ed-client-ca\") pod \"controller-manager-54859d675c-l58ct\" (UID: \"08d5d820-0282-4a0c-93b5-bd5892c1a8ed\") " pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.529319 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08d5d820-0282-4a0c-93b5-bd5892c1a8ed-proxy-ca-bundles\") pod 
\"controller-manager-54859d675c-l58ct\" (UID: \"08d5d820-0282-4a0c-93b5-bd5892c1a8ed\") " pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.533847 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08d5d820-0282-4a0c-93b5-bd5892c1a8ed-serving-cert\") pod \"controller-manager-54859d675c-l58ct\" (UID: \"08d5d820-0282-4a0c-93b5-bd5892c1a8ed\") " pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.533866 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e830ca8b-4596-4ee1-89f0-fcbe99ad85d4-serving-cert\") pod \"route-controller-manager-585775dd7f-gfs8w\" (UID: \"e830ca8b-4596-4ee1-89f0-fcbe99ad85d4\") " pod="openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.542858 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtv9m\" (UniqueName: \"kubernetes.io/projected/e830ca8b-4596-4ee1-89f0-fcbe99ad85d4-kube-api-access-gtv9m\") pod \"route-controller-manager-585775dd7f-gfs8w\" (UID: \"e830ca8b-4596-4ee1-89f0-fcbe99ad85d4\") " pod="openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.547657 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbrqv\" (UniqueName: \"kubernetes.io/projected/08d5d820-0282-4a0c-93b5-bd5892c1a8ed-kube-api-access-xbrqv\") pod \"controller-manager-54859d675c-l58ct\" (UID: \"08d5d820-0282-4a0c-93b5-bd5892c1a8ed\") " pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.695451 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.720959 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.768943 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63fc8742-b2a1-42a1-b78e-11e736801124" path="/var/lib/kubelet/pods/63fc8742-b2a1-42a1-b78e-11e736801124/volumes" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.770487 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9063b645-eba3-4ba3-a871-23adad70136d" path="/var/lib/kubelet/pods/9063b645-eba3-4ba3-a871-23adad70136d/volumes" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.818219 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk"] Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.819159 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.820613 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.865021 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk"] Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.930383 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3544dc93-111d-4a49-90fc-92d76ad66184-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk\" (UID: \"3544dc93-111d-4a49-90fc-92d76ad66184\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.930455 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3544dc93-111d-4a49-90fc-92d76ad66184-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk\" (UID: \"3544dc93-111d-4a49-90fc-92d76ad66184\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk" Oct 09 10:40:09 crc kubenswrapper[4740]: I1009 10:40:09.930484 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqhcd\" (UniqueName: \"kubernetes.io/projected/3544dc93-111d-4a49-90fc-92d76ad66184-kube-api-access-xqhcd\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk\" (UID: \"3544dc93-111d-4a49-90fc-92d76ad66184\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk" Oct 09 10:40:09 crc kubenswrapper[4740]: 
I1009 10:40:09.979652 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w"] Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.016669 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54859d675c-l58ct"] Oct 09 10:40:10 crc kubenswrapper[4740]: W1009 10:40:10.021990 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08d5d820_0282_4a0c_93b5_bd5892c1a8ed.slice/crio-91bf917fde5ecb3c9b53d0294ed971399b02e601791c8f9e89472b6c3cbc1cc4 WatchSource:0}: Error finding container 91bf917fde5ecb3c9b53d0294ed971399b02e601791c8f9e89472b6c3cbc1cc4: Status 404 returned error can't find the container with id 91bf917fde5ecb3c9b53d0294ed971399b02e601791c8f9e89472b6c3cbc1cc4 Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.032503 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3544dc93-111d-4a49-90fc-92d76ad66184-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk\" (UID: \"3544dc93-111d-4a49-90fc-92d76ad66184\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk" Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.032580 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3544dc93-111d-4a49-90fc-92d76ad66184-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk\" (UID: \"3544dc93-111d-4a49-90fc-92d76ad66184\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk" Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.032611 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqhcd\" (UniqueName: 
\"kubernetes.io/projected/3544dc93-111d-4a49-90fc-92d76ad66184-kube-api-access-xqhcd\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk\" (UID: \"3544dc93-111d-4a49-90fc-92d76ad66184\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk" Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.033338 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3544dc93-111d-4a49-90fc-92d76ad66184-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk\" (UID: \"3544dc93-111d-4a49-90fc-92d76ad66184\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk" Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.033782 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3544dc93-111d-4a49-90fc-92d76ad66184-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk\" (UID: \"3544dc93-111d-4a49-90fc-92d76ad66184\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk" Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.049360 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqhcd\" (UniqueName: \"kubernetes.io/projected/3544dc93-111d-4a49-90fc-92d76ad66184-kube-api-access-xqhcd\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk\" (UID: \"3544dc93-111d-4a49-90fc-92d76ad66184\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk" Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.170433 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk" Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.371169 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk"] Oct 09 10:40:10 crc kubenswrapper[4740]: W1009 10:40:10.376548 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3544dc93_111d_4a49_90fc_92d76ad66184.slice/crio-74a5f5694498f10b0bddb6a700a86592e7d092f85c9c40fbfa9ddf8522de3d62 WatchSource:0}: Error finding container 74a5f5694498f10b0bddb6a700a86592e7d092f85c9c40fbfa9ddf8522de3d62: Status 404 returned error can't find the container with id 74a5f5694498f10b0bddb6a700a86592e7d092f85c9c40fbfa9ddf8522de3d62 Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.868167 4740 generic.go:334] "Generic (PLEG): container finished" podID="3544dc93-111d-4a49-90fc-92d76ad66184" containerID="1b92dd2cfa8f97cfdbfe6b525f7c58e826e4dfaf2fdd2b793ceeaadafd12dc60" exitCode=0 Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.868256 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk" event={"ID":"3544dc93-111d-4a49-90fc-92d76ad66184","Type":"ContainerDied","Data":"1b92dd2cfa8f97cfdbfe6b525f7c58e826e4dfaf2fdd2b793ceeaadafd12dc60"} Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.868615 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk" event={"ID":"3544dc93-111d-4a49-90fc-92d76ad66184","Type":"ContainerStarted","Data":"74a5f5694498f10b0bddb6a700a86592e7d092f85c9c40fbfa9ddf8522de3d62"} Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.870342 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w" event={"ID":"e830ca8b-4596-4ee1-89f0-fcbe99ad85d4","Type":"ContainerStarted","Data":"2b542b8f5cee94044aac1295099aeba2f6722c53f20f4cc26c455bee3f031497"} Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.870376 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w" event={"ID":"e830ca8b-4596-4ee1-89f0-fcbe99ad85d4","Type":"ContainerStarted","Data":"67981a544bc1a28e57c139f5d631d1d2504f4059939eb86a386a5cd07df7f38e"} Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.870698 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w" Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.871984 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" event={"ID":"08d5d820-0282-4a0c-93b5-bd5892c1a8ed","Type":"ContainerStarted","Data":"381679cba4bacbe5bfb9176145213e25f8267714d43ba3b2fc3eb2ba29782e9c"} Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.872037 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" event={"ID":"08d5d820-0282-4a0c-93b5-bd5892c1a8ed","Type":"ContainerStarted","Data":"91bf917fde5ecb3c9b53d0294ed971399b02e601791c8f9e89472b6c3cbc1cc4"} Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.872215 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.876199 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w" Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.878935 4740 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.967007 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-585775dd7f-gfs8w" podStartSLOduration=3.96698658 podStartE2EDuration="3.96698658s" podCreationTimestamp="2025-10-09 10:40:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:40:10.900458299 +0000 UTC m=+749.862658690" watchObservedRunningTime="2025-10-09 10:40:10.96698658 +0000 UTC m=+749.929186961" Oct 09 10:40:10 crc kubenswrapper[4740]: I1009 10:40:10.967098 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54859d675c-l58ct" podStartSLOduration=3.967093443 podStartE2EDuration="3.967093443s" podCreationTimestamp="2025-10-09 10:40:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:40:10.966060864 +0000 UTC m=+749.928261235" watchObservedRunningTime="2025-10-09 10:40:10.967093443 +0000 UTC m=+749.929293824" Oct 09 10:40:11 crc kubenswrapper[4740]: I1009 10:40:11.833763 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-g68sq" podUID="6fab96fb-79cd-4d15-a23a-20d1bd2d5c39" containerName="console" containerID="cri-o://88d557bf9f6cce99aa0abfe178173c12ff37c7235c5316fe71f010acc8747b0d" gracePeriod=15 Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.342295 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-g68sq_6fab96fb-79cd-4d15-a23a-20d1bd2d5c39/console/0.log" Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.342558 4740 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-g68sq" Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.463468 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-service-ca\") pod \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.463511 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-console-serving-cert\") pod \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.463540 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-trusted-ca-bundle\") pod \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.463597 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-console-config\") pod \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.463653 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-oauth-serving-cert\") pod \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.463675 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-console-oauth-config\") pod \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.463708 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v8nt\" (UniqueName: \"kubernetes.io/projected/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-kube-api-access-8v8nt\") pod \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\" (UID: \"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39\") " Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.464503 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-console-config" (OuterVolumeSpecName: "console-config") pod "6fab96fb-79cd-4d15-a23a-20d1bd2d5c39" (UID: "6fab96fb-79cd-4d15-a23a-20d1bd2d5c39"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.464539 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6fab96fb-79cd-4d15-a23a-20d1bd2d5c39" (UID: "6fab96fb-79cd-4d15-a23a-20d1bd2d5c39"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.464665 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6fab96fb-79cd-4d15-a23a-20d1bd2d5c39" (UID: "6fab96fb-79cd-4d15-a23a-20d1bd2d5c39"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.464966 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-service-ca" (OuterVolumeSpecName: "service-ca") pod "6fab96fb-79cd-4d15-a23a-20d1bd2d5c39" (UID: "6fab96fb-79cd-4d15-a23a-20d1bd2d5c39"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.468912 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-kube-api-access-8v8nt" (OuterVolumeSpecName: "kube-api-access-8v8nt") pod "6fab96fb-79cd-4d15-a23a-20d1bd2d5c39" (UID: "6fab96fb-79cd-4d15-a23a-20d1bd2d5c39"). InnerVolumeSpecName "kube-api-access-8v8nt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.469147 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6fab96fb-79cd-4d15-a23a-20d1bd2d5c39" (UID: "6fab96fb-79cd-4d15-a23a-20d1bd2d5c39"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.470065 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6fab96fb-79cd-4d15-a23a-20d1bd2d5c39" (UID: "6fab96fb-79cd-4d15-a23a-20d1bd2d5c39"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.565313 4740 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.565361 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v8nt\" (UniqueName: \"kubernetes.io/projected/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-kube-api-access-8v8nt\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.565385 4740 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.565402 4740 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.565431 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.565449 4740 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-console-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.565466 4740 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:12 crc 
kubenswrapper[4740]: I1009 10:40:12.891942 4740 generic.go:334] "Generic (PLEG): container finished" podID="3544dc93-111d-4a49-90fc-92d76ad66184" containerID="0852af133835338e15c5bd2fdd0cd58d40547c04ca3f85ee7a049161d5cf05ab" exitCode=0 Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.892047 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk" event={"ID":"3544dc93-111d-4a49-90fc-92d76ad66184","Type":"ContainerDied","Data":"0852af133835338e15c5bd2fdd0cd58d40547c04ca3f85ee7a049161d5cf05ab"} Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.895311 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-g68sq_6fab96fb-79cd-4d15-a23a-20d1bd2d5c39/console/0.log" Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.895359 4740 generic.go:334] "Generic (PLEG): container finished" podID="6fab96fb-79cd-4d15-a23a-20d1bd2d5c39" containerID="88d557bf9f6cce99aa0abfe178173c12ff37c7235c5316fe71f010acc8747b0d" exitCode=2 Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.895707 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-g68sq" event={"ID":"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39","Type":"ContainerDied","Data":"88d557bf9f6cce99aa0abfe178173c12ff37c7235c5316fe71f010acc8747b0d"} Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.895823 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-g68sq" event={"ID":"6fab96fb-79cd-4d15-a23a-20d1bd2d5c39","Type":"ContainerDied","Data":"92ab555905978d118b0186ab5ba1bb24e2f46b62cd1760f6fbd3dd55209938fc"} Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.895863 4740 scope.go:117] "RemoveContainer" containerID="88d557bf9f6cce99aa0abfe178173c12ff37c7235c5316fe71f010acc8747b0d" Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.896073 4740 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-f9d7485db-g68sq" Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.925393 4740 scope.go:117] "RemoveContainer" containerID="88d557bf9f6cce99aa0abfe178173c12ff37c7235c5316fe71f010acc8747b0d" Oct 09 10:40:12 crc kubenswrapper[4740]: E1009 10:40:12.926042 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88d557bf9f6cce99aa0abfe178173c12ff37c7235c5316fe71f010acc8747b0d\": container with ID starting with 88d557bf9f6cce99aa0abfe178173c12ff37c7235c5316fe71f010acc8747b0d not found: ID does not exist" containerID="88d557bf9f6cce99aa0abfe178173c12ff37c7235c5316fe71f010acc8747b0d" Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.926087 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88d557bf9f6cce99aa0abfe178173c12ff37c7235c5316fe71f010acc8747b0d"} err="failed to get container status \"88d557bf9f6cce99aa0abfe178173c12ff37c7235c5316fe71f010acc8747b0d\": rpc error: code = NotFound desc = could not find container \"88d557bf9f6cce99aa0abfe178173c12ff37c7235c5316fe71f010acc8747b0d\": container with ID starting with 88d557bf9f6cce99aa0abfe178173c12ff37c7235c5316fe71f010acc8747b0d not found: ID does not exist" Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.947335 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-g68sq"] Oct 09 10:40:12 crc kubenswrapper[4740]: I1009 10:40:12.950852 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-g68sq"] Oct 09 10:40:13 crc kubenswrapper[4740]: I1009 10:40:13.762600 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fab96fb-79cd-4d15-a23a-20d1bd2d5c39" path="/var/lib/kubelet/pods/6fab96fb-79cd-4d15-a23a-20d1bd2d5c39/volumes" Oct 09 10:40:13 crc kubenswrapper[4740]: I1009 10:40:13.904059 4740 generic.go:334] "Generic 
(PLEG): container finished" podID="3544dc93-111d-4a49-90fc-92d76ad66184" containerID="6539a25fb105112c8caa8e3b3dc08c4500101f509fe7f9397456d8fedb8a4b62" exitCode=0 Oct 09 10:40:13 crc kubenswrapper[4740]: I1009 10:40:13.904121 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk" event={"ID":"3544dc93-111d-4a49-90fc-92d76ad66184","Type":"ContainerDied","Data":"6539a25fb105112c8caa8e3b3dc08c4500101f509fe7f9397456d8fedb8a4b62"} Oct 09 10:40:15 crc kubenswrapper[4740]: I1009 10:40:15.163082 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk" Oct 09 10:40:15 crc kubenswrapper[4740]: I1009 10:40:15.301245 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqhcd\" (UniqueName: \"kubernetes.io/projected/3544dc93-111d-4a49-90fc-92d76ad66184-kube-api-access-xqhcd\") pod \"3544dc93-111d-4a49-90fc-92d76ad66184\" (UID: \"3544dc93-111d-4a49-90fc-92d76ad66184\") " Oct 09 10:40:15 crc kubenswrapper[4740]: I1009 10:40:15.301359 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3544dc93-111d-4a49-90fc-92d76ad66184-bundle\") pod \"3544dc93-111d-4a49-90fc-92d76ad66184\" (UID: \"3544dc93-111d-4a49-90fc-92d76ad66184\") " Oct 09 10:40:15 crc kubenswrapper[4740]: I1009 10:40:15.301444 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3544dc93-111d-4a49-90fc-92d76ad66184-util\") pod \"3544dc93-111d-4a49-90fc-92d76ad66184\" (UID: \"3544dc93-111d-4a49-90fc-92d76ad66184\") " Oct 09 10:40:15 crc kubenswrapper[4740]: I1009 10:40:15.303236 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3544dc93-111d-4a49-90fc-92d76ad66184-bundle" (OuterVolumeSpecName: "bundle") pod "3544dc93-111d-4a49-90fc-92d76ad66184" (UID: "3544dc93-111d-4a49-90fc-92d76ad66184"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:40:15 crc kubenswrapper[4740]: I1009 10:40:15.309900 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3544dc93-111d-4a49-90fc-92d76ad66184-kube-api-access-xqhcd" (OuterVolumeSpecName: "kube-api-access-xqhcd") pod "3544dc93-111d-4a49-90fc-92d76ad66184" (UID: "3544dc93-111d-4a49-90fc-92d76ad66184"). InnerVolumeSpecName "kube-api-access-xqhcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:40:15 crc kubenswrapper[4740]: I1009 10:40:15.329964 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3544dc93-111d-4a49-90fc-92d76ad66184-util" (OuterVolumeSpecName: "util") pod "3544dc93-111d-4a49-90fc-92d76ad66184" (UID: "3544dc93-111d-4a49-90fc-92d76ad66184"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:40:15 crc kubenswrapper[4740]: I1009 10:40:15.402361 4740 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3544dc93-111d-4a49-90fc-92d76ad66184-util\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:15 crc kubenswrapper[4740]: I1009 10:40:15.402396 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqhcd\" (UniqueName: \"kubernetes.io/projected/3544dc93-111d-4a49-90fc-92d76ad66184-kube-api-access-xqhcd\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:15 crc kubenswrapper[4740]: I1009 10:40:15.402409 4740 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3544dc93-111d-4a49-90fc-92d76ad66184-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:15 crc kubenswrapper[4740]: I1009 10:40:15.922950 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk" event={"ID":"3544dc93-111d-4a49-90fc-92d76ad66184","Type":"ContainerDied","Data":"74a5f5694498f10b0bddb6a700a86592e7d092f85c9c40fbfa9ddf8522de3d62"} Oct 09 10:40:15 crc kubenswrapper[4740]: I1009 10:40:15.923008 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74a5f5694498f10b0bddb6a700a86592e7d092f85c9c40fbfa9ddf8522de3d62" Oct 09 10:40:15 crc kubenswrapper[4740]: I1009 10:40:15.923101 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk" Oct 09 10:40:16 crc kubenswrapper[4740]: I1009 10:40:16.473495 4740 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 09 10:40:16 crc kubenswrapper[4740]: I1009 10:40:16.573447 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5tvt5"] Oct 09 10:40:16 crc kubenswrapper[4740]: E1009 10:40:16.573803 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3544dc93-111d-4a49-90fc-92d76ad66184" containerName="util" Oct 09 10:40:16 crc kubenswrapper[4740]: I1009 10:40:16.573824 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="3544dc93-111d-4a49-90fc-92d76ad66184" containerName="util" Oct 09 10:40:16 crc kubenswrapper[4740]: E1009 10:40:16.573858 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fab96fb-79cd-4d15-a23a-20d1bd2d5c39" containerName="console" Oct 09 10:40:16 crc kubenswrapper[4740]: I1009 10:40:16.573871 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fab96fb-79cd-4d15-a23a-20d1bd2d5c39" containerName="console" Oct 09 10:40:16 crc kubenswrapper[4740]: E1009 10:40:16.573889 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3544dc93-111d-4a49-90fc-92d76ad66184" containerName="pull" Oct 09 10:40:16 crc kubenswrapper[4740]: I1009 10:40:16.573904 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="3544dc93-111d-4a49-90fc-92d76ad66184" containerName="pull" Oct 09 10:40:16 crc kubenswrapper[4740]: E1009 10:40:16.573925 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3544dc93-111d-4a49-90fc-92d76ad66184" containerName="extract" Oct 09 10:40:16 crc kubenswrapper[4740]: I1009 10:40:16.573937 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="3544dc93-111d-4a49-90fc-92d76ad66184" containerName="extract" Oct 09 
10:40:16 crc kubenswrapper[4740]: I1009 10:40:16.574128 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fab96fb-79cd-4d15-a23a-20d1bd2d5c39" containerName="console" Oct 09 10:40:16 crc kubenswrapper[4740]: I1009 10:40:16.574144 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="3544dc93-111d-4a49-90fc-92d76ad66184" containerName="extract" Oct 09 10:40:16 crc kubenswrapper[4740]: I1009 10:40:16.575027 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tvt5" Oct 09 10:40:16 crc kubenswrapper[4740]: I1009 10:40:16.605490 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5tvt5"] Oct 09 10:40:16 crc kubenswrapper[4740]: I1009 10:40:16.729945 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f802fba-25d3-4de5-a9ff-29b564cf5920-utilities\") pod \"redhat-operators-5tvt5\" (UID: \"1f802fba-25d3-4de5-a9ff-29b564cf5920\") " pod="openshift-marketplace/redhat-operators-5tvt5" Oct 09 10:40:16 crc kubenswrapper[4740]: I1009 10:40:16.730035 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvjgv\" (UniqueName: \"kubernetes.io/projected/1f802fba-25d3-4de5-a9ff-29b564cf5920-kube-api-access-pvjgv\") pod \"redhat-operators-5tvt5\" (UID: \"1f802fba-25d3-4de5-a9ff-29b564cf5920\") " pod="openshift-marketplace/redhat-operators-5tvt5" Oct 09 10:40:16 crc kubenswrapper[4740]: I1009 10:40:16.730105 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f802fba-25d3-4de5-a9ff-29b564cf5920-catalog-content\") pod \"redhat-operators-5tvt5\" (UID: \"1f802fba-25d3-4de5-a9ff-29b564cf5920\") " pod="openshift-marketplace/redhat-operators-5tvt5" Oct 09 10:40:16 crc 
kubenswrapper[4740]: I1009 10:40:16.831541 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f802fba-25d3-4de5-a9ff-29b564cf5920-catalog-content\") pod \"redhat-operators-5tvt5\" (UID: \"1f802fba-25d3-4de5-a9ff-29b564cf5920\") " pod="openshift-marketplace/redhat-operators-5tvt5" Oct 09 10:40:16 crc kubenswrapper[4740]: I1009 10:40:16.831887 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f802fba-25d3-4de5-a9ff-29b564cf5920-utilities\") pod \"redhat-operators-5tvt5\" (UID: \"1f802fba-25d3-4de5-a9ff-29b564cf5920\") " pod="openshift-marketplace/redhat-operators-5tvt5" Oct 09 10:40:16 crc kubenswrapper[4740]: I1009 10:40:16.832076 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f802fba-25d3-4de5-a9ff-29b564cf5920-catalog-content\") pod \"redhat-operators-5tvt5\" (UID: \"1f802fba-25d3-4de5-a9ff-29b564cf5920\") " pod="openshift-marketplace/redhat-operators-5tvt5" Oct 09 10:40:16 crc kubenswrapper[4740]: I1009 10:40:16.832095 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvjgv\" (UniqueName: \"kubernetes.io/projected/1f802fba-25d3-4de5-a9ff-29b564cf5920-kube-api-access-pvjgv\") pod \"redhat-operators-5tvt5\" (UID: \"1f802fba-25d3-4de5-a9ff-29b564cf5920\") " pod="openshift-marketplace/redhat-operators-5tvt5" Oct 09 10:40:16 crc kubenswrapper[4740]: I1009 10:40:16.832317 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f802fba-25d3-4de5-a9ff-29b564cf5920-utilities\") pod \"redhat-operators-5tvt5\" (UID: \"1f802fba-25d3-4de5-a9ff-29b564cf5920\") " pod="openshift-marketplace/redhat-operators-5tvt5" Oct 09 10:40:16 crc kubenswrapper[4740]: I1009 10:40:16.849875 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvjgv\" (UniqueName: \"kubernetes.io/projected/1f802fba-25d3-4de5-a9ff-29b564cf5920-kube-api-access-pvjgv\") pod \"redhat-operators-5tvt5\" (UID: \"1f802fba-25d3-4de5-a9ff-29b564cf5920\") " pod="openshift-marketplace/redhat-operators-5tvt5" Oct 09 10:40:16 crc kubenswrapper[4740]: I1009 10:40:16.899887 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tvt5" Oct 09 10:40:17 crc kubenswrapper[4740]: I1009 10:40:17.315454 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5tvt5"] Oct 09 10:40:17 crc kubenswrapper[4740]: I1009 10:40:17.936003 4740 generic.go:334] "Generic (PLEG): container finished" podID="1f802fba-25d3-4de5-a9ff-29b564cf5920" containerID="b7d009c4ba255915501afd6719ae3540207dd054003b561c713230d9141ded4a" exitCode=0 Oct 09 10:40:17 crc kubenswrapper[4740]: I1009 10:40:17.936130 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tvt5" event={"ID":"1f802fba-25d3-4de5-a9ff-29b564cf5920","Type":"ContainerDied","Data":"b7d009c4ba255915501afd6719ae3540207dd054003b561c713230d9141ded4a"} Oct 09 10:40:17 crc kubenswrapper[4740]: I1009 10:40:17.937633 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tvt5" event={"ID":"1f802fba-25d3-4de5-a9ff-29b564cf5920","Type":"ContainerStarted","Data":"0cfe91c8e52f22c9c59976ca731a232881acd5a8c71d160bf82b1e2cdb2536d8"} Oct 09 10:40:18 crc kubenswrapper[4740]: I1009 10:40:18.947027 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tvt5" event={"ID":"1f802fba-25d3-4de5-a9ff-29b564cf5920","Type":"ContainerStarted","Data":"6a286106af5754e6d1689a25dfcd0f85beae45484173782123bc4a9cb7b5b0b7"} Oct 09 10:40:19 crc kubenswrapper[4740]: I1009 10:40:19.953492 4740 generic.go:334] "Generic 
(PLEG): container finished" podID="1f802fba-25d3-4de5-a9ff-29b564cf5920" containerID="6a286106af5754e6d1689a25dfcd0f85beae45484173782123bc4a9cb7b5b0b7" exitCode=0 Oct 09 10:40:19 crc kubenswrapper[4740]: I1009 10:40:19.953531 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tvt5" event={"ID":"1f802fba-25d3-4de5-a9ff-29b564cf5920","Type":"ContainerDied","Data":"6a286106af5754e6d1689a25dfcd0f85beae45484173782123bc4a9cb7b5b0b7"} Oct 09 10:40:20 crc kubenswrapper[4740]: I1009 10:40:20.961703 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tvt5" event={"ID":"1f802fba-25d3-4de5-a9ff-29b564cf5920","Type":"ContainerStarted","Data":"a52e047fe42671d561eecae8da761acae3003b8d3625761ed0fb249545ae1069"} Oct 09 10:40:20 crc kubenswrapper[4740]: I1009 10:40:20.977900 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5tvt5" podStartSLOduration=2.433057694 podStartE2EDuration="4.97788282s" podCreationTimestamp="2025-10-09 10:40:16 +0000 UTC" firstStartedPulling="2025-10-09 10:40:17.937359614 +0000 UTC m=+756.899559995" lastFinishedPulling="2025-10-09 10:40:20.48218474 +0000 UTC m=+759.444385121" observedRunningTime="2025-10-09 10:40:20.974890207 +0000 UTC m=+759.937090628" watchObservedRunningTime="2025-10-09 10:40:20.97788282 +0000 UTC m=+759.940083201" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.014993 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7698d6f4f4-hrnng"] Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.016005 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7698d6f4f4-hrnng" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.017632 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.017669 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.018368 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.020059 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.020631 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-tghsf" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.050420 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7698d6f4f4-hrnng"] Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.127211 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97758f55-d70a-4949-b056-5673a1975dd5-webhook-cert\") pod \"metallb-operator-controller-manager-7698d6f4f4-hrnng\" (UID: \"97758f55-d70a-4949-b056-5673a1975dd5\") " pod="metallb-system/metallb-operator-controller-manager-7698d6f4f4-hrnng" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.127492 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97758f55-d70a-4949-b056-5673a1975dd5-apiservice-cert\") pod \"metallb-operator-controller-manager-7698d6f4f4-hrnng\" (UID: 
\"97758f55-d70a-4949-b056-5673a1975dd5\") " pod="metallb-system/metallb-operator-controller-manager-7698d6f4f4-hrnng" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.127641 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmq8n\" (UniqueName: \"kubernetes.io/projected/97758f55-d70a-4949-b056-5673a1975dd5-kube-api-access-wmq8n\") pod \"metallb-operator-controller-manager-7698d6f4f4-hrnng\" (UID: \"97758f55-d70a-4949-b056-5673a1975dd5\") " pod="metallb-system/metallb-operator-controller-manager-7698d6f4f4-hrnng" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.228609 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmq8n\" (UniqueName: \"kubernetes.io/projected/97758f55-d70a-4949-b056-5673a1975dd5-kube-api-access-wmq8n\") pod \"metallb-operator-controller-manager-7698d6f4f4-hrnng\" (UID: \"97758f55-d70a-4949-b056-5673a1975dd5\") " pod="metallb-system/metallb-operator-controller-manager-7698d6f4f4-hrnng" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.228695 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97758f55-d70a-4949-b056-5673a1975dd5-webhook-cert\") pod \"metallb-operator-controller-manager-7698d6f4f4-hrnng\" (UID: \"97758f55-d70a-4949-b056-5673a1975dd5\") " pod="metallb-system/metallb-operator-controller-manager-7698d6f4f4-hrnng" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.228715 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97758f55-d70a-4949-b056-5673a1975dd5-apiservice-cert\") pod \"metallb-operator-controller-manager-7698d6f4f4-hrnng\" (UID: \"97758f55-d70a-4949-b056-5673a1975dd5\") " pod="metallb-system/metallb-operator-controller-manager-7698d6f4f4-hrnng" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.236562 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97758f55-d70a-4949-b056-5673a1975dd5-webhook-cert\") pod \"metallb-operator-controller-manager-7698d6f4f4-hrnng\" (UID: \"97758f55-d70a-4949-b056-5673a1975dd5\") " pod="metallb-system/metallb-operator-controller-manager-7698d6f4f4-hrnng" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.237287 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97758f55-d70a-4949-b056-5673a1975dd5-apiservice-cert\") pod \"metallb-operator-controller-manager-7698d6f4f4-hrnng\" (UID: \"97758f55-d70a-4949-b056-5673a1975dd5\") " pod="metallb-system/metallb-operator-controller-manager-7698d6f4f4-hrnng" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.251866 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmq8n\" (UniqueName: \"kubernetes.io/projected/97758f55-d70a-4949-b056-5673a1975dd5-kube-api-access-wmq8n\") pod \"metallb-operator-controller-manager-7698d6f4f4-hrnng\" (UID: \"97758f55-d70a-4949-b056-5673a1975dd5\") " pod="metallb-system/metallb-operator-controller-manager-7698d6f4f4-hrnng" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.330126 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7698d6f4f4-hrnng" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.359704 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6749d4b858-4656g"] Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.360390 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6749d4b858-4656g" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.364298 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.364401 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.364489 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-scvv7" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.449626 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6749d4b858-4656g"] Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.537133 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzmkk\" (UniqueName: \"kubernetes.io/projected/9f363b47-14ab-4719-93dd-db269dc8f132-kube-api-access-zzmkk\") pod \"metallb-operator-webhook-server-6749d4b858-4656g\" (UID: \"9f363b47-14ab-4719-93dd-db269dc8f132\") " pod="metallb-system/metallb-operator-webhook-server-6749d4b858-4656g" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.537228 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9f363b47-14ab-4719-93dd-db269dc8f132-apiservice-cert\") pod \"metallb-operator-webhook-server-6749d4b858-4656g\" (UID: \"9f363b47-14ab-4719-93dd-db269dc8f132\") " pod="metallb-system/metallb-operator-webhook-server-6749d4b858-4656g" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.537281 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/9f363b47-14ab-4719-93dd-db269dc8f132-webhook-cert\") pod \"metallb-operator-webhook-server-6749d4b858-4656g\" (UID: \"9f363b47-14ab-4719-93dd-db269dc8f132\") " pod="metallb-system/metallb-operator-webhook-server-6749d4b858-4656g" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.638142 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzmkk\" (UniqueName: \"kubernetes.io/projected/9f363b47-14ab-4719-93dd-db269dc8f132-kube-api-access-zzmkk\") pod \"metallb-operator-webhook-server-6749d4b858-4656g\" (UID: \"9f363b47-14ab-4719-93dd-db269dc8f132\") " pod="metallb-system/metallb-operator-webhook-server-6749d4b858-4656g" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.638217 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9f363b47-14ab-4719-93dd-db269dc8f132-apiservice-cert\") pod \"metallb-operator-webhook-server-6749d4b858-4656g\" (UID: \"9f363b47-14ab-4719-93dd-db269dc8f132\") " pod="metallb-system/metallb-operator-webhook-server-6749d4b858-4656g" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.638244 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9f363b47-14ab-4719-93dd-db269dc8f132-webhook-cert\") pod \"metallb-operator-webhook-server-6749d4b858-4656g\" (UID: \"9f363b47-14ab-4719-93dd-db269dc8f132\") " pod="metallb-system/metallb-operator-webhook-server-6749d4b858-4656g" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.643604 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9f363b47-14ab-4719-93dd-db269dc8f132-apiservice-cert\") pod \"metallb-operator-webhook-server-6749d4b858-4656g\" (UID: \"9f363b47-14ab-4719-93dd-db269dc8f132\") " pod="metallb-system/metallb-operator-webhook-server-6749d4b858-4656g" Oct 09 
10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.644246 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9f363b47-14ab-4719-93dd-db269dc8f132-webhook-cert\") pod \"metallb-operator-webhook-server-6749d4b858-4656g\" (UID: \"9f363b47-14ab-4719-93dd-db269dc8f132\") " pod="metallb-system/metallb-operator-webhook-server-6749d4b858-4656g" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.664540 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzmkk\" (UniqueName: \"kubernetes.io/projected/9f363b47-14ab-4719-93dd-db269dc8f132-kube-api-access-zzmkk\") pod \"metallb-operator-webhook-server-6749d4b858-4656g\" (UID: \"9f363b47-14ab-4719-93dd-db269dc8f132\") " pod="metallb-system/metallb-operator-webhook-server-6749d4b858-4656g" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.694155 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6749d4b858-4656g" Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.796246 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7698d6f4f4-hrnng"] Oct 09 10:40:25 crc kubenswrapper[4740]: W1009 10:40:25.806978 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97758f55_d70a_4949_b056_5673a1975dd5.slice/crio-9cf3f60da9e39eeaef280700f3714356232aefb6791726ba7bedf5a38b3b76e9 WatchSource:0}: Error finding container 9cf3f60da9e39eeaef280700f3714356232aefb6791726ba7bedf5a38b3b76e9: Status 404 returned error can't find the container with id 9cf3f60da9e39eeaef280700f3714356232aefb6791726ba7bedf5a38b3b76e9 Oct 09 10:40:25 crc kubenswrapper[4740]: I1009 10:40:25.991984 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7698d6f4f4-hrnng" 
event={"ID":"97758f55-d70a-4949-b056-5673a1975dd5","Type":"ContainerStarted","Data":"9cf3f60da9e39eeaef280700f3714356232aefb6791726ba7bedf5a38b3b76e9"} Oct 09 10:40:26 crc kubenswrapper[4740]: I1009 10:40:26.139735 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6749d4b858-4656g"] Oct 09 10:40:26 crc kubenswrapper[4740]: W1009 10:40:26.143279 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f363b47_14ab_4719_93dd_db269dc8f132.slice/crio-b1831ab0d06be8bfe7b42f91b37fbae302b5b3b80e8b607d6fcdedd43299a1ea WatchSource:0}: Error finding container b1831ab0d06be8bfe7b42f91b37fbae302b5b3b80e8b607d6fcdedd43299a1ea: Status 404 returned error can't find the container with id b1831ab0d06be8bfe7b42f91b37fbae302b5b3b80e8b607d6fcdedd43299a1ea Oct 09 10:40:26 crc kubenswrapper[4740]: I1009 10:40:26.900552 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5tvt5" Oct 09 10:40:26 crc kubenswrapper[4740]: I1009 10:40:26.901025 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5tvt5" Oct 09 10:40:26 crc kubenswrapper[4740]: I1009 10:40:26.976673 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5tvt5" Oct 09 10:40:27 crc kubenswrapper[4740]: I1009 10:40:27.006382 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6749d4b858-4656g" event={"ID":"9f363b47-14ab-4719-93dd-db269dc8f132","Type":"ContainerStarted","Data":"b1831ab0d06be8bfe7b42f91b37fbae302b5b3b80e8b607d6fcdedd43299a1ea"} Oct 09 10:40:27 crc kubenswrapper[4740]: I1009 10:40:27.050549 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5tvt5" Oct 09 10:40:30 crc 
kubenswrapper[4740]: I1009 10:40:30.759722 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5tvt5"] Oct 09 10:40:30 crc kubenswrapper[4740]: I1009 10:40:30.760161 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5tvt5" podUID="1f802fba-25d3-4de5-a9ff-29b564cf5920" containerName="registry-server" containerID="cri-o://a52e047fe42671d561eecae8da761acae3003b8d3625761ed0fb249545ae1069" gracePeriod=2 Oct 09 10:40:31 crc kubenswrapper[4740]: I1009 10:40:31.051525 4740 generic.go:334] "Generic (PLEG): container finished" podID="1f802fba-25d3-4de5-a9ff-29b564cf5920" containerID="a52e047fe42671d561eecae8da761acae3003b8d3625761ed0fb249545ae1069" exitCode=0 Oct 09 10:40:31 crc kubenswrapper[4740]: I1009 10:40:31.051640 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tvt5" event={"ID":"1f802fba-25d3-4de5-a9ff-29b564cf5920","Type":"ContainerDied","Data":"a52e047fe42671d561eecae8da761acae3003b8d3625761ed0fb249545ae1069"} Oct 09 10:40:32 crc kubenswrapper[4740]: I1009 10:40:32.364201 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5tvt5" Oct 09 10:40:32 crc kubenswrapper[4740]: I1009 10:40:32.437131 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f802fba-25d3-4de5-a9ff-29b564cf5920-utilities\") pod \"1f802fba-25d3-4de5-a9ff-29b564cf5920\" (UID: \"1f802fba-25d3-4de5-a9ff-29b564cf5920\") " Oct 09 10:40:32 crc kubenswrapper[4740]: I1009 10:40:32.437477 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvjgv\" (UniqueName: \"kubernetes.io/projected/1f802fba-25d3-4de5-a9ff-29b564cf5920-kube-api-access-pvjgv\") pod \"1f802fba-25d3-4de5-a9ff-29b564cf5920\" (UID: \"1f802fba-25d3-4de5-a9ff-29b564cf5920\") " Oct 09 10:40:32 crc kubenswrapper[4740]: I1009 10:40:32.437596 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f802fba-25d3-4de5-a9ff-29b564cf5920-catalog-content\") pod \"1f802fba-25d3-4de5-a9ff-29b564cf5920\" (UID: \"1f802fba-25d3-4de5-a9ff-29b564cf5920\") " Oct 09 10:40:32 crc kubenswrapper[4740]: I1009 10:40:32.438192 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f802fba-25d3-4de5-a9ff-29b564cf5920-utilities" (OuterVolumeSpecName: "utilities") pod "1f802fba-25d3-4de5-a9ff-29b564cf5920" (UID: "1f802fba-25d3-4de5-a9ff-29b564cf5920"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:40:32 crc kubenswrapper[4740]: I1009 10:40:32.442730 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f802fba-25d3-4de5-a9ff-29b564cf5920-kube-api-access-pvjgv" (OuterVolumeSpecName: "kube-api-access-pvjgv") pod "1f802fba-25d3-4de5-a9ff-29b564cf5920" (UID: "1f802fba-25d3-4de5-a9ff-29b564cf5920"). InnerVolumeSpecName "kube-api-access-pvjgv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:40:32 crc kubenswrapper[4740]: I1009 10:40:32.517966 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f802fba-25d3-4de5-a9ff-29b564cf5920-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f802fba-25d3-4de5-a9ff-29b564cf5920" (UID: "1f802fba-25d3-4de5-a9ff-29b564cf5920"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:40:32 crc kubenswrapper[4740]: I1009 10:40:32.539296 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f802fba-25d3-4de5-a9ff-29b564cf5920-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:32 crc kubenswrapper[4740]: I1009 10:40:32.539344 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvjgv\" (UniqueName: \"kubernetes.io/projected/1f802fba-25d3-4de5-a9ff-29b564cf5920-kube-api-access-pvjgv\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:32 crc kubenswrapper[4740]: I1009 10:40:32.539363 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f802fba-25d3-4de5-a9ff-29b564cf5920-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:33 crc kubenswrapper[4740]: I1009 10:40:33.071783 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6749d4b858-4656g" event={"ID":"9f363b47-14ab-4719-93dd-db269dc8f132","Type":"ContainerStarted","Data":"f417e72a373eb09402daa7333c60a5702394dcc490f4c62fa85e798d1e6b2126"} Oct 09 10:40:33 crc kubenswrapper[4740]: I1009 10:40:33.073187 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7698d6f4f4-hrnng" event={"ID":"97758f55-d70a-4949-b056-5673a1975dd5","Type":"ContainerStarted","Data":"2a9174340e9ccd580b9337c5d206d73ef8c60c36957a16173f5fbb24268904d1"} Oct 
09 10:40:33 crc kubenswrapper[4740]: I1009 10:40:33.073306 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7698d6f4f4-hrnng" Oct 09 10:40:33 crc kubenswrapper[4740]: I1009 10:40:33.075416 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tvt5" event={"ID":"1f802fba-25d3-4de5-a9ff-29b564cf5920","Type":"ContainerDied","Data":"0cfe91c8e52f22c9c59976ca731a232881acd5a8c71d160bf82b1e2cdb2536d8"} Oct 09 10:40:33 crc kubenswrapper[4740]: I1009 10:40:33.075450 4740 scope.go:117] "RemoveContainer" containerID="a52e047fe42671d561eecae8da761acae3003b8d3625761ed0fb249545ae1069" Oct 09 10:40:33 crc kubenswrapper[4740]: I1009 10:40:33.075503 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tvt5" Oct 09 10:40:33 crc kubenswrapper[4740]: I1009 10:40:33.095431 4740 scope.go:117] "RemoveContainer" containerID="6a286106af5754e6d1689a25dfcd0f85beae45484173782123bc4a9cb7b5b0b7" Oct 09 10:40:33 crc kubenswrapper[4740]: I1009 10:40:33.098793 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6749d4b858-4656g" podStartSLOduration=2.197952822 podStartE2EDuration="8.098769861s" podCreationTimestamp="2025-10-09 10:40:25 +0000 UTC" firstStartedPulling="2025-10-09 10:40:26.147070826 +0000 UTC m=+765.109271217" lastFinishedPulling="2025-10-09 10:40:32.047887875 +0000 UTC m=+771.010088256" observedRunningTime="2025-10-09 10:40:33.095124249 +0000 UTC m=+772.057324630" watchObservedRunningTime="2025-10-09 10:40:33.098769861 +0000 UTC m=+772.060970242" Oct 09 10:40:33 crc kubenswrapper[4740]: I1009 10:40:33.121335 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7698d6f4f4-hrnng" podStartSLOduration=1.905629599 podStartE2EDuration="8.121318948s" 
podCreationTimestamp="2025-10-09 10:40:25 +0000 UTC" firstStartedPulling="2025-10-09 10:40:25.811386247 +0000 UTC m=+764.773586618" lastFinishedPulling="2025-10-09 10:40:32.027075586 +0000 UTC m=+770.989275967" observedRunningTime="2025-10-09 10:40:33.119342123 +0000 UTC m=+772.081542524" watchObservedRunningTime="2025-10-09 10:40:33.121318948 +0000 UTC m=+772.083519329" Oct 09 10:40:33 crc kubenswrapper[4740]: I1009 10:40:33.129625 4740 scope.go:117] "RemoveContainer" containerID="b7d009c4ba255915501afd6719ae3540207dd054003b561c713230d9141ded4a" Oct 09 10:40:33 crc kubenswrapper[4740]: I1009 10:40:33.137037 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5tvt5"] Oct 09 10:40:33 crc kubenswrapper[4740]: I1009 10:40:33.142094 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5tvt5"] Oct 09 10:40:33 crc kubenswrapper[4740]: I1009 10:40:33.759362 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f802fba-25d3-4de5-a9ff-29b564cf5920" path="/var/lib/kubelet/pods/1f802fba-25d3-4de5-a9ff-29b564cf5920/volumes" Oct 09 10:40:34 crc kubenswrapper[4740]: I1009 10:40:34.084304 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6749d4b858-4656g" Oct 09 10:40:35 crc kubenswrapper[4740]: I1009 10:40:35.408279 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 10:40:35 crc kubenswrapper[4740]: I1009 10:40:35.408715 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 10:40:35 crc kubenswrapper[4740]: I1009 10:40:35.408813 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 10:40:35 crc kubenswrapper[4740]: I1009 10:40:35.409730 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"afadc9267ef0dcffe417993e78f8ce5f9baf0ee72c33f5f9de1c87bbb7818e64"} pod="openshift-machine-config-operator/machine-config-daemon-kdjch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 10:40:35 crc kubenswrapper[4740]: I1009 10:40:35.409833 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" containerID="cri-o://afadc9267ef0dcffe417993e78f8ce5f9baf0ee72c33f5f9de1c87bbb7818e64" gracePeriod=600 Oct 09 10:40:36 crc kubenswrapper[4740]: I1009 10:40:36.096505 4740 generic.go:334] "Generic (PLEG): container finished" podID="223b849a-db98-4f56-a649-9e144189950a" containerID="afadc9267ef0dcffe417993e78f8ce5f9baf0ee72c33f5f9de1c87bbb7818e64" exitCode=0 Oct 09 10:40:36 crc kubenswrapper[4740]: I1009 10:40:36.096575 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerDied","Data":"afadc9267ef0dcffe417993e78f8ce5f9baf0ee72c33f5f9de1c87bbb7818e64"} Oct 09 10:40:36 crc kubenswrapper[4740]: I1009 10:40:36.096883 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" 
event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerStarted","Data":"db6bdc02b2d1bf480bf563dc4b4a9b65b436c587d39e3c847d517ccd6a5d7f1c"} Oct 09 10:40:36 crc kubenswrapper[4740]: I1009 10:40:36.096912 4740 scope.go:117] "RemoveContainer" containerID="db6d8672ec18a4cbb225b6946060995fa4e5d7aa11d060bcb6cd3c36cef0c580" Oct 09 10:40:44 crc kubenswrapper[4740]: I1009 10:40:44.882208 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fhrwg"] Oct 09 10:40:44 crc kubenswrapper[4740]: E1009 10:40:44.883057 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f802fba-25d3-4de5-a9ff-29b564cf5920" containerName="extract-utilities" Oct 09 10:40:44 crc kubenswrapper[4740]: I1009 10:40:44.883073 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f802fba-25d3-4de5-a9ff-29b564cf5920" containerName="extract-utilities" Oct 09 10:40:44 crc kubenswrapper[4740]: E1009 10:40:44.883090 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f802fba-25d3-4de5-a9ff-29b564cf5920" containerName="extract-content" Oct 09 10:40:44 crc kubenswrapper[4740]: I1009 10:40:44.883098 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f802fba-25d3-4de5-a9ff-29b564cf5920" containerName="extract-content" Oct 09 10:40:44 crc kubenswrapper[4740]: E1009 10:40:44.883115 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f802fba-25d3-4de5-a9ff-29b564cf5920" containerName="registry-server" Oct 09 10:40:44 crc kubenswrapper[4740]: I1009 10:40:44.883123 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f802fba-25d3-4de5-a9ff-29b564cf5920" containerName="registry-server" Oct 09 10:40:44 crc kubenswrapper[4740]: I1009 10:40:44.883256 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f802fba-25d3-4de5-a9ff-29b564cf5920" containerName="registry-server" Oct 09 10:40:44 crc kubenswrapper[4740]: I1009 10:40:44.884186 4740 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhrwg" Oct 09 10:40:44 crc kubenswrapper[4740]: I1009 10:40:44.902743 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhrwg"] Oct 09 10:40:44 crc kubenswrapper[4740]: I1009 10:40:44.910647 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc5wz\" (UniqueName: \"kubernetes.io/projected/1cb3c0a5-6214-46cb-86b8-8f29e7714eb4-kube-api-access-fc5wz\") pod \"redhat-marketplace-fhrwg\" (UID: \"1cb3c0a5-6214-46cb-86b8-8f29e7714eb4\") " pod="openshift-marketplace/redhat-marketplace-fhrwg" Oct 09 10:40:44 crc kubenswrapper[4740]: I1009 10:40:44.910698 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cb3c0a5-6214-46cb-86b8-8f29e7714eb4-catalog-content\") pod \"redhat-marketplace-fhrwg\" (UID: \"1cb3c0a5-6214-46cb-86b8-8f29e7714eb4\") " pod="openshift-marketplace/redhat-marketplace-fhrwg" Oct 09 10:40:44 crc kubenswrapper[4740]: I1009 10:40:44.910727 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cb3c0a5-6214-46cb-86b8-8f29e7714eb4-utilities\") pod \"redhat-marketplace-fhrwg\" (UID: \"1cb3c0a5-6214-46cb-86b8-8f29e7714eb4\") " pod="openshift-marketplace/redhat-marketplace-fhrwg" Oct 09 10:40:45 crc kubenswrapper[4740]: I1009 10:40:45.011525 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc5wz\" (UniqueName: \"kubernetes.io/projected/1cb3c0a5-6214-46cb-86b8-8f29e7714eb4-kube-api-access-fc5wz\") pod \"redhat-marketplace-fhrwg\" (UID: \"1cb3c0a5-6214-46cb-86b8-8f29e7714eb4\") " pod="openshift-marketplace/redhat-marketplace-fhrwg" Oct 09 10:40:45 crc kubenswrapper[4740]: I1009 10:40:45.011576 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cb3c0a5-6214-46cb-86b8-8f29e7714eb4-catalog-content\") pod \"redhat-marketplace-fhrwg\" (UID: \"1cb3c0a5-6214-46cb-86b8-8f29e7714eb4\") " pod="openshift-marketplace/redhat-marketplace-fhrwg" Oct 09 10:40:45 crc kubenswrapper[4740]: I1009 10:40:45.011602 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cb3c0a5-6214-46cb-86b8-8f29e7714eb4-utilities\") pod \"redhat-marketplace-fhrwg\" (UID: \"1cb3c0a5-6214-46cb-86b8-8f29e7714eb4\") " pod="openshift-marketplace/redhat-marketplace-fhrwg" Oct 09 10:40:45 crc kubenswrapper[4740]: I1009 10:40:45.012307 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cb3c0a5-6214-46cb-86b8-8f29e7714eb4-utilities\") pod \"redhat-marketplace-fhrwg\" (UID: \"1cb3c0a5-6214-46cb-86b8-8f29e7714eb4\") " pod="openshift-marketplace/redhat-marketplace-fhrwg" Oct 09 10:40:45 crc kubenswrapper[4740]: I1009 10:40:45.012402 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cb3c0a5-6214-46cb-86b8-8f29e7714eb4-catalog-content\") pod \"redhat-marketplace-fhrwg\" (UID: \"1cb3c0a5-6214-46cb-86b8-8f29e7714eb4\") " pod="openshift-marketplace/redhat-marketplace-fhrwg" Oct 09 10:40:45 crc kubenswrapper[4740]: I1009 10:40:45.034068 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc5wz\" (UniqueName: \"kubernetes.io/projected/1cb3c0a5-6214-46cb-86b8-8f29e7714eb4-kube-api-access-fc5wz\") pod \"redhat-marketplace-fhrwg\" (UID: \"1cb3c0a5-6214-46cb-86b8-8f29e7714eb4\") " pod="openshift-marketplace/redhat-marketplace-fhrwg" Oct 09 10:40:45 crc kubenswrapper[4740]: I1009 10:40:45.247663 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhrwg" Oct 09 10:40:45 crc kubenswrapper[4740]: I1009 10:40:45.648496 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhrwg"] Oct 09 10:40:45 crc kubenswrapper[4740]: I1009 10:40:45.698593 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6749d4b858-4656g" Oct 09 10:40:46 crc kubenswrapper[4740]: I1009 10:40:46.155620 4740 generic.go:334] "Generic (PLEG): container finished" podID="1cb3c0a5-6214-46cb-86b8-8f29e7714eb4" containerID="438ae30c36918ce202ebc7b08e6a2167b6b03c699185d888e106cd568d481853" exitCode=0 Oct 09 10:40:46 crc kubenswrapper[4740]: I1009 10:40:46.155658 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhrwg" event={"ID":"1cb3c0a5-6214-46cb-86b8-8f29e7714eb4","Type":"ContainerDied","Data":"438ae30c36918ce202ebc7b08e6a2167b6b03c699185d888e106cd568d481853"} Oct 09 10:40:46 crc kubenswrapper[4740]: I1009 10:40:46.155680 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhrwg" event={"ID":"1cb3c0a5-6214-46cb-86b8-8f29e7714eb4","Type":"ContainerStarted","Data":"e21b2f3a4b5405919726ace62e78aed80cca9667892d76b8e6b7a3682b0770a9"} Oct 09 10:40:48 crc kubenswrapper[4740]: I1009 10:40:48.171467 4740 generic.go:334] "Generic (PLEG): container finished" podID="1cb3c0a5-6214-46cb-86b8-8f29e7714eb4" containerID="6bea45452672ff04f4c5c51a6c9bce4a1716e13bbfcbc4ce6dc1373870234d30" exitCode=0 Oct 09 10:40:48 crc kubenswrapper[4740]: I1009 10:40:48.171542 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhrwg" event={"ID":"1cb3c0a5-6214-46cb-86b8-8f29e7714eb4","Type":"ContainerDied","Data":"6bea45452672ff04f4c5c51a6c9bce4a1716e13bbfcbc4ce6dc1373870234d30"} Oct 09 10:40:49 crc kubenswrapper[4740]: I1009 10:40:49.179472 4740 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhrwg" event={"ID":"1cb3c0a5-6214-46cb-86b8-8f29e7714eb4","Type":"ContainerStarted","Data":"8be897b4b8861242e1dc01776ebeb626d768499d59b5aa2b8f1a7cc8daa44a4d"} Oct 09 10:40:49 crc kubenswrapper[4740]: I1009 10:40:49.209273 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fhrwg" podStartSLOduration=2.788830451 podStartE2EDuration="5.209257736s" podCreationTimestamp="2025-10-09 10:40:44 +0000 UTC" firstStartedPulling="2025-10-09 10:40:46.157438637 +0000 UTC m=+785.119639048" lastFinishedPulling="2025-10-09 10:40:48.577865952 +0000 UTC m=+787.540066333" observedRunningTime="2025-10-09 10:40:49.205694868 +0000 UTC m=+788.167895259" watchObservedRunningTime="2025-10-09 10:40:49.209257736 +0000 UTC m=+788.171458117" Oct 09 10:40:55 crc kubenswrapper[4740]: I1009 10:40:55.248399 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fhrwg" Oct 09 10:40:55 crc kubenswrapper[4740]: I1009 10:40:55.249049 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fhrwg" Oct 09 10:40:55 crc kubenswrapper[4740]: I1009 10:40:55.290105 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fhrwg" Oct 09 10:40:56 crc kubenswrapper[4740]: I1009 10:40:56.290039 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fhrwg" Oct 09 10:40:56 crc kubenswrapper[4740]: I1009 10:40:56.346849 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhrwg"] Oct 09 10:40:58 crc kubenswrapper[4740]: I1009 10:40:58.233179 4740 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-fhrwg" podUID="1cb3c0a5-6214-46cb-86b8-8f29e7714eb4" containerName="registry-server" containerID="cri-o://8be897b4b8861242e1dc01776ebeb626d768499d59b5aa2b8f1a7cc8daa44a4d" gracePeriod=2 Oct 09 10:40:58 crc kubenswrapper[4740]: I1009 10:40:58.685974 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhrwg" Oct 09 10:40:58 crc kubenswrapper[4740]: I1009 10:40:58.696253 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc5wz\" (UniqueName: \"kubernetes.io/projected/1cb3c0a5-6214-46cb-86b8-8f29e7714eb4-kube-api-access-fc5wz\") pod \"1cb3c0a5-6214-46cb-86b8-8f29e7714eb4\" (UID: \"1cb3c0a5-6214-46cb-86b8-8f29e7714eb4\") " Oct 09 10:40:58 crc kubenswrapper[4740]: I1009 10:40:58.696323 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cb3c0a5-6214-46cb-86b8-8f29e7714eb4-catalog-content\") pod \"1cb3c0a5-6214-46cb-86b8-8f29e7714eb4\" (UID: \"1cb3c0a5-6214-46cb-86b8-8f29e7714eb4\") " Oct 09 10:40:58 crc kubenswrapper[4740]: I1009 10:40:58.696387 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cb3c0a5-6214-46cb-86b8-8f29e7714eb4-utilities\") pod \"1cb3c0a5-6214-46cb-86b8-8f29e7714eb4\" (UID: \"1cb3c0a5-6214-46cb-86b8-8f29e7714eb4\") " Oct 09 10:40:58 crc kubenswrapper[4740]: I1009 10:40:58.697681 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cb3c0a5-6214-46cb-86b8-8f29e7714eb4-utilities" (OuterVolumeSpecName: "utilities") pod "1cb3c0a5-6214-46cb-86b8-8f29e7714eb4" (UID: "1cb3c0a5-6214-46cb-86b8-8f29e7714eb4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:40:58 crc kubenswrapper[4740]: I1009 10:40:58.703926 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cb3c0a5-6214-46cb-86b8-8f29e7714eb4-kube-api-access-fc5wz" (OuterVolumeSpecName: "kube-api-access-fc5wz") pod "1cb3c0a5-6214-46cb-86b8-8f29e7714eb4" (UID: "1cb3c0a5-6214-46cb-86b8-8f29e7714eb4"). InnerVolumeSpecName "kube-api-access-fc5wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:40:58 crc kubenswrapper[4740]: I1009 10:40:58.726597 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cb3c0a5-6214-46cb-86b8-8f29e7714eb4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1cb3c0a5-6214-46cb-86b8-8f29e7714eb4" (UID: "1cb3c0a5-6214-46cb-86b8-8f29e7714eb4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:40:58 crc kubenswrapper[4740]: I1009 10:40:58.798610 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc5wz\" (UniqueName: \"kubernetes.io/projected/1cb3c0a5-6214-46cb-86b8-8f29e7714eb4-kube-api-access-fc5wz\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:58 crc kubenswrapper[4740]: I1009 10:40:58.798658 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cb3c0a5-6214-46cb-86b8-8f29e7714eb4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:58 crc kubenswrapper[4740]: I1009 10:40:58.798677 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cb3c0a5-6214-46cb-86b8-8f29e7714eb4-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 10:40:59 crc kubenswrapper[4740]: I1009 10:40:59.244625 4740 generic.go:334] "Generic (PLEG): container finished" podID="1cb3c0a5-6214-46cb-86b8-8f29e7714eb4" 
containerID="8be897b4b8861242e1dc01776ebeb626d768499d59b5aa2b8f1a7cc8daa44a4d" exitCode=0 Oct 09 10:40:59 crc kubenswrapper[4740]: I1009 10:40:59.244687 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhrwg" event={"ID":"1cb3c0a5-6214-46cb-86b8-8f29e7714eb4","Type":"ContainerDied","Data":"8be897b4b8861242e1dc01776ebeb626d768499d59b5aa2b8f1a7cc8daa44a4d"} Oct 09 10:40:59 crc kubenswrapper[4740]: I1009 10:40:59.244724 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhrwg" event={"ID":"1cb3c0a5-6214-46cb-86b8-8f29e7714eb4","Type":"ContainerDied","Data":"e21b2f3a4b5405919726ace62e78aed80cca9667892d76b8e6b7a3682b0770a9"} Oct 09 10:40:59 crc kubenswrapper[4740]: I1009 10:40:59.244775 4740 scope.go:117] "RemoveContainer" containerID="8be897b4b8861242e1dc01776ebeb626d768499d59b5aa2b8f1a7cc8daa44a4d" Oct 09 10:40:59 crc kubenswrapper[4740]: I1009 10:40:59.244941 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhrwg" Oct 09 10:40:59 crc kubenswrapper[4740]: I1009 10:40:59.269306 4740 scope.go:117] "RemoveContainer" containerID="6bea45452672ff04f4c5c51a6c9bce4a1716e13bbfcbc4ce6dc1373870234d30" Oct 09 10:40:59 crc kubenswrapper[4740]: I1009 10:40:59.287253 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhrwg"] Oct 09 10:40:59 crc kubenswrapper[4740]: I1009 10:40:59.294747 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhrwg"] Oct 09 10:40:59 crc kubenswrapper[4740]: I1009 10:40:59.315128 4740 scope.go:117] "RemoveContainer" containerID="438ae30c36918ce202ebc7b08e6a2167b6b03c699185d888e106cd568d481853" Oct 09 10:40:59 crc kubenswrapper[4740]: I1009 10:40:59.338570 4740 scope.go:117] "RemoveContainer" containerID="8be897b4b8861242e1dc01776ebeb626d768499d59b5aa2b8f1a7cc8daa44a4d" Oct 09 10:40:59 crc kubenswrapper[4740]: E1009 10:40:59.339089 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be897b4b8861242e1dc01776ebeb626d768499d59b5aa2b8f1a7cc8daa44a4d\": container with ID starting with 8be897b4b8861242e1dc01776ebeb626d768499d59b5aa2b8f1a7cc8daa44a4d not found: ID does not exist" containerID="8be897b4b8861242e1dc01776ebeb626d768499d59b5aa2b8f1a7cc8daa44a4d" Oct 09 10:40:59 crc kubenswrapper[4740]: I1009 10:40:59.339156 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be897b4b8861242e1dc01776ebeb626d768499d59b5aa2b8f1a7cc8daa44a4d"} err="failed to get container status \"8be897b4b8861242e1dc01776ebeb626d768499d59b5aa2b8f1a7cc8daa44a4d\": rpc error: code = NotFound desc = could not find container \"8be897b4b8861242e1dc01776ebeb626d768499d59b5aa2b8f1a7cc8daa44a4d\": container with ID starting with 8be897b4b8861242e1dc01776ebeb626d768499d59b5aa2b8f1a7cc8daa44a4d not found: 
ID does not exist" Oct 09 10:40:59 crc kubenswrapper[4740]: I1009 10:40:59.339192 4740 scope.go:117] "RemoveContainer" containerID="6bea45452672ff04f4c5c51a6c9bce4a1716e13bbfcbc4ce6dc1373870234d30" Oct 09 10:40:59 crc kubenswrapper[4740]: E1009 10:40:59.339707 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bea45452672ff04f4c5c51a6c9bce4a1716e13bbfcbc4ce6dc1373870234d30\": container with ID starting with 6bea45452672ff04f4c5c51a6c9bce4a1716e13bbfcbc4ce6dc1373870234d30 not found: ID does not exist" containerID="6bea45452672ff04f4c5c51a6c9bce4a1716e13bbfcbc4ce6dc1373870234d30" Oct 09 10:40:59 crc kubenswrapper[4740]: I1009 10:40:59.339774 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bea45452672ff04f4c5c51a6c9bce4a1716e13bbfcbc4ce6dc1373870234d30"} err="failed to get container status \"6bea45452672ff04f4c5c51a6c9bce4a1716e13bbfcbc4ce6dc1373870234d30\": rpc error: code = NotFound desc = could not find container \"6bea45452672ff04f4c5c51a6c9bce4a1716e13bbfcbc4ce6dc1373870234d30\": container with ID starting with 6bea45452672ff04f4c5c51a6c9bce4a1716e13bbfcbc4ce6dc1373870234d30 not found: ID does not exist" Oct 09 10:40:59 crc kubenswrapper[4740]: I1009 10:40:59.339804 4740 scope.go:117] "RemoveContainer" containerID="438ae30c36918ce202ebc7b08e6a2167b6b03c699185d888e106cd568d481853" Oct 09 10:40:59 crc kubenswrapper[4740]: E1009 10:40:59.340365 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"438ae30c36918ce202ebc7b08e6a2167b6b03c699185d888e106cd568d481853\": container with ID starting with 438ae30c36918ce202ebc7b08e6a2167b6b03c699185d888e106cd568d481853 not found: ID does not exist" containerID="438ae30c36918ce202ebc7b08e6a2167b6b03c699185d888e106cd568d481853" Oct 09 10:40:59 crc kubenswrapper[4740]: I1009 10:40:59.340431 4740 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"438ae30c36918ce202ebc7b08e6a2167b6b03c699185d888e106cd568d481853"} err="failed to get container status \"438ae30c36918ce202ebc7b08e6a2167b6b03c699185d888e106cd568d481853\": rpc error: code = NotFound desc = could not find container \"438ae30c36918ce202ebc7b08e6a2167b6b03c699185d888e106cd568d481853\": container with ID starting with 438ae30c36918ce202ebc7b08e6a2167b6b03c699185d888e106cd568d481853 not found: ID does not exist"
Oct 09 10:40:59 crc kubenswrapper[4740]: I1009 10:40:59.766001 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cb3c0a5-6214-46cb-86b8-8f29e7714eb4" path="/var/lib/kubelet/pods/1cb3c0a5-6214-46cb-86b8-8f29e7714eb4/volumes"
Oct 09 10:41:05 crc kubenswrapper[4740]: I1009 10:41:05.334632 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7698d6f4f4-hrnng"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.168368 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-76cjs"]
Oct 09 10:41:06 crc kubenswrapper[4740]: E1009 10:41:06.168618 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cb3c0a5-6214-46cb-86b8-8f29e7714eb4" containerName="extract-utilities"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.168632 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cb3c0a5-6214-46cb-86b8-8f29e7714eb4" containerName="extract-utilities"
Oct 09 10:41:06 crc kubenswrapper[4740]: E1009 10:41:06.168652 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cb3c0a5-6214-46cb-86b8-8f29e7714eb4" containerName="registry-server"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.168660 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cb3c0a5-6214-46cb-86b8-8f29e7714eb4" containerName="registry-server"
Oct 09 10:41:06 crc kubenswrapper[4740]: E1009 10:41:06.168682 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cb3c0a5-6214-46cb-86b8-8f29e7714eb4" containerName="extract-content"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.168690 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cb3c0a5-6214-46cb-86b8-8f29e7714eb4" containerName="extract-content"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.168823 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cb3c0a5-6214-46cb-86b8-8f29e7714eb4" containerName="registry-server"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.169276 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-76cjs"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.173256 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.173368 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-vpcvp"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.195958 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-76cjs"]
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.241543 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-svscr"]
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.243787 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.248502 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.248624 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.315434 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01117083-3a07-4afc-b678-11b52fd9edea-cert\") pod \"frr-k8s-webhook-server-64bf5d555-76cjs\" (UID: \"01117083-3a07-4afc-b678-11b52fd9edea\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-76cjs"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.315477 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb5rg\" (UniqueName: \"kubernetes.io/projected/01117083-3a07-4afc-b678-11b52fd9edea-kube-api-access-xb5rg\") pod \"frr-k8s-webhook-server-64bf5d555-76cjs\" (UID: \"01117083-3a07-4afc-b678-11b52fd9edea\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-76cjs"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.316745 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-8dwvb"]
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.317586 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-8dwvb"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.322095 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.333506 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-qzfv2"]
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.334978 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-qzfv2"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.342593 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.342847 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.343537 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-h79k8"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.347483 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.351848 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-8dwvb"]
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.417154 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbktx\" (UniqueName: \"kubernetes.io/projected/66ca384b-ba80-4854-a999-bb78c9db0e6b-kube-api-access-xbktx\") pod \"frr-k8s-svscr\" (UID: \"66ca384b-ba80-4854-a999-bb78c9db0e6b\") " pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.417202 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/66ca384b-ba80-4854-a999-bb78c9db0e6b-frr-startup\") pod \"frr-k8s-svscr\" (UID: \"66ca384b-ba80-4854-a999-bb78c9db0e6b\") " pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.417231 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01117083-3a07-4afc-b678-11b52fd9edea-cert\") pod \"frr-k8s-webhook-server-64bf5d555-76cjs\" (UID: \"01117083-3a07-4afc-b678-11b52fd9edea\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-76cjs"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.417371 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb5rg\" (UniqueName: \"kubernetes.io/projected/01117083-3a07-4afc-b678-11b52fd9edea-kube-api-access-xb5rg\") pod \"frr-k8s-webhook-server-64bf5d555-76cjs\" (UID: \"01117083-3a07-4afc-b678-11b52fd9edea\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-76cjs"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.417399 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/66ca384b-ba80-4854-a999-bb78c9db0e6b-frr-conf\") pod \"frr-k8s-svscr\" (UID: \"66ca384b-ba80-4854-a999-bb78c9db0e6b\") " pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.417421 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/66ca384b-ba80-4854-a999-bb78c9db0e6b-frr-sockets\") pod \"frr-k8s-svscr\" (UID: \"66ca384b-ba80-4854-a999-bb78c9db0e6b\") " pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.417455 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/66ca384b-ba80-4854-a999-bb78c9db0e6b-reloader\") pod \"frr-k8s-svscr\" (UID: \"66ca384b-ba80-4854-a999-bb78c9db0e6b\") " pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.417492 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66ca384b-ba80-4854-a999-bb78c9db0e6b-metrics-certs\") pod \"frr-k8s-svscr\" (UID: \"66ca384b-ba80-4854-a999-bb78c9db0e6b\") " pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.417521 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/66ca384b-ba80-4854-a999-bb78c9db0e6b-metrics\") pod \"frr-k8s-svscr\" (UID: \"66ca384b-ba80-4854-a999-bb78c9db0e6b\") " pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.423383 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01117083-3a07-4afc-b678-11b52fd9edea-cert\") pod \"frr-k8s-webhook-server-64bf5d555-76cjs\" (UID: \"01117083-3a07-4afc-b678-11b52fd9edea\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-76cjs"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.432189 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb5rg\" (UniqueName: \"kubernetes.io/projected/01117083-3a07-4afc-b678-11b52fd9edea-kube-api-access-xb5rg\") pod \"frr-k8s-webhook-server-64bf5d555-76cjs\" (UID: \"01117083-3a07-4afc-b678-11b52fd9edea\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-76cjs"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.484834 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-76cjs"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.518614 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jljn5\" (UniqueName: \"kubernetes.io/projected/e58ad5c0-164a-4ed4-b665-44068078198c-kube-api-access-jljn5\") pod \"controller-68d546b9d8-8dwvb\" (UID: \"e58ad5c0-164a-4ed4-b665-44068078198c\") " pod="metallb-system/controller-68d546b9d8-8dwvb"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.518658 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbktx\" (UniqueName: \"kubernetes.io/projected/66ca384b-ba80-4854-a999-bb78c9db0e6b-kube-api-access-xbktx\") pod \"frr-k8s-svscr\" (UID: \"66ca384b-ba80-4854-a999-bb78c9db0e6b\") " pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.518689 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/66ca384b-ba80-4854-a999-bb78c9db0e6b-frr-startup\") pod \"frr-k8s-svscr\" (UID: \"66ca384b-ba80-4854-a999-bb78c9db0e6b\") " pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.518719 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3bb4221a-8b49-4183-a53b-6f81deafb446-metallb-excludel2\") pod \"speaker-qzfv2\" (UID: \"3bb4221a-8b49-4183-a53b-6f81deafb446\") " pod="metallb-system/speaker-qzfv2"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.518736 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/66ca384b-ba80-4854-a999-bb78c9db0e6b-frr-conf\") pod \"frr-k8s-svscr\" (UID: \"66ca384b-ba80-4854-a999-bb78c9db0e6b\") " pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.518764 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e58ad5c0-164a-4ed4-b665-44068078198c-metrics-certs\") pod \"controller-68d546b9d8-8dwvb\" (UID: \"e58ad5c0-164a-4ed4-b665-44068078198c\") " pod="metallb-system/controller-68d546b9d8-8dwvb"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.518785 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3bb4221a-8b49-4183-a53b-6f81deafb446-memberlist\") pod \"speaker-qzfv2\" (UID: \"3bb4221a-8b49-4183-a53b-6f81deafb446\") " pod="metallb-system/speaker-qzfv2"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.518800 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/66ca384b-ba80-4854-a999-bb78c9db0e6b-frr-sockets\") pod \"frr-k8s-svscr\" (UID: \"66ca384b-ba80-4854-a999-bb78c9db0e6b\") " pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.518814 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24255\" (UniqueName: \"kubernetes.io/projected/3bb4221a-8b49-4183-a53b-6f81deafb446-kube-api-access-24255\") pod \"speaker-qzfv2\" (UID: \"3bb4221a-8b49-4183-a53b-6f81deafb446\") " pod="metallb-system/speaker-qzfv2"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.518842 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bb4221a-8b49-4183-a53b-6f81deafb446-metrics-certs\") pod \"speaker-qzfv2\" (UID: \"3bb4221a-8b49-4183-a53b-6f81deafb446\") " pod="metallb-system/speaker-qzfv2"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.518857 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/66ca384b-ba80-4854-a999-bb78c9db0e6b-reloader\") pod \"frr-k8s-svscr\" (UID: \"66ca384b-ba80-4854-a999-bb78c9db0e6b\") " pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.518887 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66ca384b-ba80-4854-a999-bb78c9db0e6b-metrics-certs\") pod \"frr-k8s-svscr\" (UID: \"66ca384b-ba80-4854-a999-bb78c9db0e6b\") " pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.518908 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e58ad5c0-164a-4ed4-b665-44068078198c-cert\") pod \"controller-68d546b9d8-8dwvb\" (UID: \"e58ad5c0-164a-4ed4-b665-44068078198c\") " pod="metallb-system/controller-68d546b9d8-8dwvb"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.518928 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/66ca384b-ba80-4854-a999-bb78c9db0e6b-metrics\") pod \"frr-k8s-svscr\" (UID: \"66ca384b-ba80-4854-a999-bb78c9db0e6b\") " pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.519352 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/66ca384b-ba80-4854-a999-bb78c9db0e6b-metrics\") pod \"frr-k8s-svscr\" (UID: \"66ca384b-ba80-4854-a999-bb78c9db0e6b\") " pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.519379 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/66ca384b-ba80-4854-a999-bb78c9db0e6b-frr-sockets\") pod \"frr-k8s-svscr\" (UID: \"66ca384b-ba80-4854-a999-bb78c9db0e6b\") " pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.519689 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/66ca384b-ba80-4854-a999-bb78c9db0e6b-frr-conf\") pod \"frr-k8s-svscr\" (UID: \"66ca384b-ba80-4854-a999-bb78c9db0e6b\") " pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.519716 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/66ca384b-ba80-4854-a999-bb78c9db0e6b-reloader\") pod \"frr-k8s-svscr\" (UID: \"66ca384b-ba80-4854-a999-bb78c9db0e6b\") " pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.520135 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/66ca384b-ba80-4854-a999-bb78c9db0e6b-frr-startup\") pod \"frr-k8s-svscr\" (UID: \"66ca384b-ba80-4854-a999-bb78c9db0e6b\") " pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.524104 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66ca384b-ba80-4854-a999-bb78c9db0e6b-metrics-certs\") pod \"frr-k8s-svscr\" (UID: \"66ca384b-ba80-4854-a999-bb78c9db0e6b\") " pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.539254 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbktx\" (UniqueName: \"kubernetes.io/projected/66ca384b-ba80-4854-a999-bb78c9db0e6b-kube-api-access-xbktx\") pod \"frr-k8s-svscr\" (UID: \"66ca384b-ba80-4854-a999-bb78c9db0e6b\") " pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.555553 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.623515 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bb4221a-8b49-4183-a53b-6f81deafb446-metrics-certs\") pod \"speaker-qzfv2\" (UID: \"3bb4221a-8b49-4183-a53b-6f81deafb446\") " pod="metallb-system/speaker-qzfv2"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.623582 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e58ad5c0-164a-4ed4-b665-44068078198c-cert\") pod \"controller-68d546b9d8-8dwvb\" (UID: \"e58ad5c0-164a-4ed4-b665-44068078198c\") " pod="metallb-system/controller-68d546b9d8-8dwvb"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.623610 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jljn5\" (UniqueName: \"kubernetes.io/projected/e58ad5c0-164a-4ed4-b665-44068078198c-kube-api-access-jljn5\") pod \"controller-68d546b9d8-8dwvb\" (UID: \"e58ad5c0-164a-4ed4-b665-44068078198c\") " pod="metallb-system/controller-68d546b9d8-8dwvb"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.623660 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3bb4221a-8b49-4183-a53b-6f81deafb446-metallb-excludel2\") pod \"speaker-qzfv2\" (UID: \"3bb4221a-8b49-4183-a53b-6f81deafb446\") " pod="metallb-system/speaker-qzfv2"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.623675 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e58ad5c0-164a-4ed4-b665-44068078198c-metrics-certs\") pod \"controller-68d546b9d8-8dwvb\" (UID: \"e58ad5c0-164a-4ed4-b665-44068078198c\") " pod="metallb-system/controller-68d546b9d8-8dwvb"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.623695 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3bb4221a-8b49-4183-a53b-6f81deafb446-memberlist\") pod \"speaker-qzfv2\" (UID: \"3bb4221a-8b49-4183-a53b-6f81deafb446\") " pod="metallb-system/speaker-qzfv2"
Oct 09 10:41:06 crc kubenswrapper[4740]: E1009 10:41:06.623702 4740 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Oct 09 10:41:06 crc kubenswrapper[4740]: E1009 10:41:06.623787 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bb4221a-8b49-4183-a53b-6f81deafb446-metrics-certs podName:3bb4221a-8b49-4183-a53b-6f81deafb446 nodeName:}" failed. No retries permitted until 2025-10-09 10:41:07.123766531 +0000 UTC m=+806.085966912 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3bb4221a-8b49-4183-a53b-6f81deafb446-metrics-certs") pod "speaker-qzfv2" (UID: "3bb4221a-8b49-4183-a53b-6f81deafb446") : secret "speaker-certs-secret" not found
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.623710 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24255\" (UniqueName: \"kubernetes.io/projected/3bb4221a-8b49-4183-a53b-6f81deafb446-kube-api-access-24255\") pod \"speaker-qzfv2\" (UID: \"3bb4221a-8b49-4183-a53b-6f81deafb446\") " pod="metallb-system/speaker-qzfv2"
Oct 09 10:41:06 crc kubenswrapper[4740]: E1009 10:41:06.623913 4740 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.625340 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3bb4221a-8b49-4183-a53b-6f81deafb446-metallb-excludel2\") pod \"speaker-qzfv2\" (UID: \"3bb4221a-8b49-4183-a53b-6f81deafb446\") " pod="metallb-system/speaker-qzfv2"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.625684 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Oct 09 10:41:06 crc kubenswrapper[4740]: E1009 10:41:06.626825 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bb4221a-8b49-4183-a53b-6f81deafb446-memberlist podName:3bb4221a-8b49-4183-a53b-6f81deafb446 nodeName:}" failed. No retries permitted until 2025-10-09 10:41:07.126794405 +0000 UTC m=+806.088994866 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3bb4221a-8b49-4183-a53b-6f81deafb446-memberlist") pod "speaker-qzfv2" (UID: "3bb4221a-8b49-4183-a53b-6f81deafb446") : secret "metallb-memberlist" not found
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.630053 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e58ad5c0-164a-4ed4-b665-44068078198c-metrics-certs\") pod \"controller-68d546b9d8-8dwvb\" (UID: \"e58ad5c0-164a-4ed4-b665-44068078198c\") " pod="metallb-system/controller-68d546b9d8-8dwvb"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.642175 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24255\" (UniqueName: \"kubernetes.io/projected/3bb4221a-8b49-4183-a53b-6f81deafb446-kube-api-access-24255\") pod \"speaker-qzfv2\" (UID: \"3bb4221a-8b49-4183-a53b-6f81deafb446\") " pod="metallb-system/speaker-qzfv2"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.643442 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jljn5\" (UniqueName: \"kubernetes.io/projected/e58ad5c0-164a-4ed4-b665-44068078198c-kube-api-access-jljn5\") pod \"controller-68d546b9d8-8dwvb\" (UID: \"e58ad5c0-164a-4ed4-b665-44068078198c\") " pod="metallb-system/controller-68d546b9d8-8dwvb"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.646781 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e58ad5c0-164a-4ed4-b665-44068078198c-cert\") pod \"controller-68d546b9d8-8dwvb\" (UID: \"e58ad5c0-164a-4ed4-b665-44068078198c\") " pod="metallb-system/controller-68d546b9d8-8dwvb"
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.926537 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-76cjs"]
Oct 09 10:41:06 crc kubenswrapper[4740]: I1009 10:41:06.929898 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-8dwvb"
Oct 09 10:41:06 crc kubenswrapper[4740]: W1009 10:41:06.931281 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01117083_3a07_4afc_b678_11b52fd9edea.slice/crio-ab7e96c6dd36299e0438eed8c98a1abb99248f2e4f93e0366680918dc5493303 WatchSource:0}: Error finding container ab7e96c6dd36299e0438eed8c98a1abb99248f2e4f93e0366680918dc5493303: Status 404 returned error can't find the container with id ab7e96c6dd36299e0438eed8c98a1abb99248f2e4f93e0366680918dc5493303
Oct 09 10:41:07 crc kubenswrapper[4740]: I1009 10:41:07.130280 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3bb4221a-8b49-4183-a53b-6f81deafb446-memberlist\") pod \"speaker-qzfv2\" (UID: \"3bb4221a-8b49-4183-a53b-6f81deafb446\") " pod="metallb-system/speaker-qzfv2"
Oct 09 10:41:07 crc kubenswrapper[4740]: I1009 10:41:07.130359 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bb4221a-8b49-4183-a53b-6f81deafb446-metrics-certs\") pod \"speaker-qzfv2\" (UID: \"3bb4221a-8b49-4183-a53b-6f81deafb446\") " pod="metallb-system/speaker-qzfv2"
Oct 09 10:41:07 crc kubenswrapper[4740]: E1009 10:41:07.130408 4740 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Oct 09 10:41:07 crc kubenswrapper[4740]: E1009 10:41:07.130460 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bb4221a-8b49-4183-a53b-6f81deafb446-memberlist podName:3bb4221a-8b49-4183-a53b-6f81deafb446 nodeName:}" failed. No retries permitted until 2025-10-09 10:41:08.130446272 +0000 UTC m=+807.092646653 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3bb4221a-8b49-4183-a53b-6f81deafb446-memberlist") pod "speaker-qzfv2" (UID: "3bb4221a-8b49-4183-a53b-6f81deafb446") : secret "metallb-memberlist" not found
Oct 09 10:41:07 crc kubenswrapper[4740]: I1009 10:41:07.133461 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bb4221a-8b49-4183-a53b-6f81deafb446-metrics-certs\") pod \"speaker-qzfv2\" (UID: \"3bb4221a-8b49-4183-a53b-6f81deafb446\") " pod="metallb-system/speaker-qzfv2"
Oct 09 10:41:07 crc kubenswrapper[4740]: I1009 10:41:07.299448 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-76cjs" event={"ID":"01117083-3a07-4afc-b678-11b52fd9edea","Type":"ContainerStarted","Data":"ab7e96c6dd36299e0438eed8c98a1abb99248f2e4f93e0366680918dc5493303"}
Oct 09 10:41:07 crc kubenswrapper[4740]: I1009 10:41:07.300570 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-svscr" event={"ID":"66ca384b-ba80-4854-a999-bb78c9db0e6b","Type":"ContainerStarted","Data":"195b27d1aa89624ca09fa83d4a57f2180a1764f5c67e0a3f41cdd9414b356ec1"}
Oct 09 10:41:07 crc kubenswrapper[4740]: I1009 10:41:07.328421 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-8dwvb"]
Oct 09 10:41:07 crc kubenswrapper[4740]: W1009 10:41:07.334381 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode58ad5c0_164a_4ed4_b665_44068078198c.slice/crio-5baf412d9623f59324b678485219f654fc42537732b8c85a777f8685d766aac2 WatchSource:0}: Error finding container 5baf412d9623f59324b678485219f654fc42537732b8c85a777f8685d766aac2: Status 404 returned error can't find the container with id 5baf412d9623f59324b678485219f654fc42537732b8c85a777f8685d766aac2
Oct 09 10:41:08 crc kubenswrapper[4740]: I1009 10:41:08.143413 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3bb4221a-8b49-4183-a53b-6f81deafb446-memberlist\") pod \"speaker-qzfv2\" (UID: \"3bb4221a-8b49-4183-a53b-6f81deafb446\") " pod="metallb-system/speaker-qzfv2"
Oct 09 10:41:08 crc kubenswrapper[4740]: I1009 10:41:08.148984 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3bb4221a-8b49-4183-a53b-6f81deafb446-memberlist\") pod \"speaker-qzfv2\" (UID: \"3bb4221a-8b49-4183-a53b-6f81deafb446\") " pod="metallb-system/speaker-qzfv2"
Oct 09 10:41:08 crc kubenswrapper[4740]: I1009 10:41:08.153589 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-qzfv2"
Oct 09 10:41:08 crc kubenswrapper[4740]: W1009 10:41:08.177473 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bb4221a_8b49_4183_a53b_6f81deafb446.slice/crio-5bc982373deb9267406e3a56714fd242ffcb08567b674c84b626f4536b0879b8 WatchSource:0}: Error finding container 5bc982373deb9267406e3a56714fd242ffcb08567b674c84b626f4536b0879b8: Status 404 returned error can't find the container with id 5bc982373deb9267406e3a56714fd242ffcb08567b674c84b626f4536b0879b8
Oct 09 10:41:08 crc kubenswrapper[4740]: I1009 10:41:08.308074 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-8dwvb" event={"ID":"e58ad5c0-164a-4ed4-b665-44068078198c","Type":"ContainerStarted","Data":"8a0fd47486f15c908277f9cc15692f5112cdfe8328371af6603b4f6d8766ec9b"}
Oct 09 10:41:08 crc kubenswrapper[4740]: I1009 10:41:08.308136 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-8dwvb" event={"ID":"e58ad5c0-164a-4ed4-b665-44068078198c","Type":"ContainerStarted","Data":"fb74be9f551e85dd31f0f187e40ff6ef0899bb655523eca3ff48d0f0c8290ac4"}
Oct 09 10:41:08 crc kubenswrapper[4740]: I1009 10:41:08.308156 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-8dwvb" event={"ID":"e58ad5c0-164a-4ed4-b665-44068078198c","Type":"ContainerStarted","Data":"5baf412d9623f59324b678485219f654fc42537732b8c85a777f8685d766aac2"}
Oct 09 10:41:08 crc kubenswrapper[4740]: I1009 10:41:08.308294 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-8dwvb"
Oct 09 10:41:08 crc kubenswrapper[4740]: I1009 10:41:08.309277 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qzfv2" event={"ID":"3bb4221a-8b49-4183-a53b-6f81deafb446","Type":"ContainerStarted","Data":"5bc982373deb9267406e3a56714fd242ffcb08567b674c84b626f4536b0879b8"}
Oct 09 10:41:08 crc kubenswrapper[4740]: I1009 10:41:08.327151 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-8dwvb" podStartSLOduration=2.3271246740000002 podStartE2EDuration="2.327124674s" podCreationTimestamp="2025-10-09 10:41:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:41:08.325306674 +0000 UTC m=+807.287507075" watchObservedRunningTime="2025-10-09 10:41:08.327124674 +0000 UTC m=+807.289325105"
Oct 09 10:41:09 crc kubenswrapper[4740]: I1009 10:41:09.331670 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qzfv2" event={"ID":"3bb4221a-8b49-4183-a53b-6f81deafb446","Type":"ContainerStarted","Data":"a59410a5c37f220e27948c8c9d530a045bc5a6916651bf25554a8de84a6e72f2"}
Oct 09 10:41:09 crc kubenswrapper[4740]: I1009 10:41:09.332045 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qzfv2" event={"ID":"3bb4221a-8b49-4183-a53b-6f81deafb446","Type":"ContainerStarted","Data":"e3aa43c968307c62ac3397f939581432265c70b427c0808ae8d3b7a7e47c5707"}
Oct 09 10:41:09 crc kubenswrapper[4740]: I1009 10:41:09.332064 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-qzfv2"
Oct 09 10:41:09 crc kubenswrapper[4740]: I1009 10:41:09.352472 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-qzfv2" podStartSLOduration=3.352446738 podStartE2EDuration="3.352446738s" podCreationTimestamp="2025-10-09 10:41:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:41:09.347547952 +0000 UTC m=+808.309748343" watchObservedRunningTime="2025-10-09 10:41:09.352446738 +0000 UTC m=+808.314647119"
Oct 09 10:41:15 crc kubenswrapper[4740]: I1009 10:41:15.368368 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-76cjs" event={"ID":"01117083-3a07-4afc-b678-11b52fd9edea","Type":"ContainerStarted","Data":"acfd5001d589206a408c658993d62d347f232c4a77112fc0e1031012f790880b"}
Oct 09 10:41:15 crc kubenswrapper[4740]: I1009 10:41:15.368969 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-76cjs"
Oct 09 10:41:15 crc kubenswrapper[4740]: I1009 10:41:15.370278 4740 generic.go:334] "Generic (PLEG): container finished" podID="66ca384b-ba80-4854-a999-bb78c9db0e6b" containerID="61b54798f0741f660f8b623598ebb996e6bf820e7d4f0eccc28e60cd494b2a1b" exitCode=0
Oct 09 10:41:15 crc kubenswrapper[4740]: I1009 10:41:15.370298 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-svscr" event={"ID":"66ca384b-ba80-4854-a999-bb78c9db0e6b","Type":"ContainerDied","Data":"61b54798f0741f660f8b623598ebb996e6bf820e7d4f0eccc28e60cd494b2a1b"}
Oct 09 10:41:15 crc kubenswrapper[4740]: I1009 10:41:15.393315 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-76cjs" podStartSLOduration=1.836496949 podStartE2EDuration="9.393287413s" podCreationTimestamp="2025-10-09 10:41:06 +0000 UTC" firstStartedPulling="2025-10-09 10:41:06.936424564 +0000 UTC m=+805.898624955" lastFinishedPulling="2025-10-09 10:41:14.493215038 +0000 UTC m=+813.455415419" observedRunningTime="2025-10-09 10:41:15.384704925 +0000 UTC m=+814.346905316" watchObservedRunningTime="2025-10-09 10:41:15.393287413 +0000 UTC m=+814.355487834"
Oct 09 10:41:16 crc kubenswrapper[4740]: I1009 10:41:16.381998 4740 generic.go:334] "Generic (PLEG): container finished" podID="66ca384b-ba80-4854-a999-bb78c9db0e6b" containerID="cb755dccdd6818c398c425e52ca4cf49421ed3c8b3bd2ecc0525c95761fa746e" exitCode=0
Oct 09 10:41:16 crc kubenswrapper[4740]: I1009 10:41:16.382128 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-svscr" event={"ID":"66ca384b-ba80-4854-a999-bb78c9db0e6b","Type":"ContainerDied","Data":"cb755dccdd6818c398c425e52ca4cf49421ed3c8b3bd2ecc0525c95761fa746e"}
Oct 09 10:41:17 crc kubenswrapper[4740]: I1009 10:41:17.392436 4740 generic.go:334] "Generic (PLEG): container finished" podID="66ca384b-ba80-4854-a999-bb78c9db0e6b" containerID="e166b50a3c9f58104dac161a4af2a249e455d45cdcf49ab7d59da20864b91b5c" exitCode=0
Oct 09 10:41:17 crc kubenswrapper[4740]: I1009 10:41:17.392527 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-svscr" event={"ID":"66ca384b-ba80-4854-a999-bb78c9db0e6b","Type":"ContainerDied","Data":"e166b50a3c9f58104dac161a4af2a249e455d45cdcf49ab7d59da20864b91b5c"}
Oct 09 10:41:18 crc kubenswrapper[4740]: I1009 10:41:18.162898 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-qzfv2"
Oct 09 10:41:18 crc kubenswrapper[4740]: I1009 10:41:18.404744 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-svscr" event={"ID":"66ca384b-ba80-4854-a999-bb78c9db0e6b","Type":"ContainerStarted","Data":"7c8dded62f25ee9436ef6a74831cb68856adabdd4e0746f6ae20caf664389bd7"}
Oct 09 10:41:18 crc kubenswrapper[4740]: I1009 10:41:18.404830 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-svscr" event={"ID":"66ca384b-ba80-4854-a999-bb78c9db0e6b","Type":"ContainerStarted","Data":"fb8baaa82fbc62f2b4b0b25cb5c399aff2c880716605041624e2b9b91b004546"}
Oct 09 10:41:18 crc kubenswrapper[4740]: I1009 10:41:18.404843 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-svscr" event={"ID":"66ca384b-ba80-4854-a999-bb78c9db0e6b","Type":"ContainerStarted","Data":"4c93bad867badf84166036af709322dd0395a8f3076f9391dc8fe84a8096aee3"}
Oct 09 10:41:18 crc kubenswrapper[4740]: I1009 10:41:18.404853 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-svscr" event={"ID":"66ca384b-ba80-4854-a999-bb78c9db0e6b","Type":"ContainerStarted","Data":"8fe4fdc6fb2841bad2292a57e70bb446b972b4a847784ba5d9529bc4d34c5d80"}
Oct 09 10:41:18 crc kubenswrapper[4740]: I1009 10:41:18.404863 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-svscr" event={"ID":"66ca384b-ba80-4854-a999-bb78c9db0e6b","Type":"ContainerStarted","Data":"8aa4fd23adc3515f24980c7589dfea67e93e2b051af6652537a363ddc4fbb26b"}
Oct 09 10:41:19 crc kubenswrapper[4740]: I1009 10:41:19.419018 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-svscr" event={"ID":"66ca384b-ba80-4854-a999-bb78c9db0e6b","Type":"ContainerStarted","Data":"a2cb89e3be46dccd31e99152c1cf51851c75abf92a33a7d080609a7ed3b49d3f"}
Oct 09 10:41:19 crc kubenswrapper[4740]: I1009 10:41:19.420214 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-svscr"
Oct 09 10:41:21 crc kubenswrapper[4740]: I1009 10:41:21.137705 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-svscr" podStartSLOduration=7.342305107 podStartE2EDuration="15.137681616s" podCreationTimestamp="2025-10-09 10:41:06 +0000 UTC" firstStartedPulling="2025-10-09 10:41:06.670638053 +0000 UTC m=+805.632838434" lastFinishedPulling="2025-10-09 10:41:14.466014522 +0000 UTC m=+813.428214943" observedRunningTime="2025-10-09 10:41:19.44715125 +0000 UTC m=+818.409351631" watchObservedRunningTime="2025-10-09 10:41:21.137681616 +0000 UTC m=+820.099882007"
Oct 09 10:41:21 crc kubenswrapper[4740]: I1009 10:41:21.143478 4740 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openstack-operators/openstack-operator-index-7glx9"] Oct 09 10:41:21 crc kubenswrapper[4740]: I1009 10:41:21.144672 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7glx9" Oct 09 10:41:21 crc kubenswrapper[4740]: I1009 10:41:21.146918 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-tsfq2" Oct 09 10:41:21 crc kubenswrapper[4740]: I1009 10:41:21.146978 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 09 10:41:21 crc kubenswrapper[4740]: I1009 10:41:21.164407 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 09 10:41:21 crc kubenswrapper[4740]: I1009 10:41:21.183498 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7glx9"] Oct 09 10:41:21 crc kubenswrapper[4740]: I1009 10:41:21.342441 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmwz2\" (UniqueName: \"kubernetes.io/projected/7a2f31c2-29b9-4126-ad6e-1500e08b6959-kube-api-access-kmwz2\") pod \"openstack-operator-index-7glx9\" (UID: \"7a2f31c2-29b9-4126-ad6e-1500e08b6959\") " pod="openstack-operators/openstack-operator-index-7glx9" Oct 09 10:41:21 crc kubenswrapper[4740]: I1009 10:41:21.443901 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmwz2\" (UniqueName: \"kubernetes.io/projected/7a2f31c2-29b9-4126-ad6e-1500e08b6959-kube-api-access-kmwz2\") pod \"openstack-operator-index-7glx9\" (UID: \"7a2f31c2-29b9-4126-ad6e-1500e08b6959\") " pod="openstack-operators/openstack-operator-index-7glx9" Oct 09 10:41:21 crc kubenswrapper[4740]: I1009 10:41:21.470724 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmwz2\" 
(UniqueName: \"kubernetes.io/projected/7a2f31c2-29b9-4126-ad6e-1500e08b6959-kube-api-access-kmwz2\") pod \"openstack-operator-index-7glx9\" (UID: \"7a2f31c2-29b9-4126-ad6e-1500e08b6959\") " pod="openstack-operators/openstack-operator-index-7glx9" Oct 09 10:41:21 crc kubenswrapper[4740]: I1009 10:41:21.475193 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7glx9" Oct 09 10:41:21 crc kubenswrapper[4740]: I1009 10:41:21.556481 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-svscr" Oct 09 10:41:21 crc kubenswrapper[4740]: I1009 10:41:21.622613 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-svscr" Oct 09 10:41:21 crc kubenswrapper[4740]: I1009 10:41:21.900632 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7glx9"] Oct 09 10:41:22 crc kubenswrapper[4740]: I1009 10:41:22.438081 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7glx9" event={"ID":"7a2f31c2-29b9-4126-ad6e-1500e08b6959","Type":"ContainerStarted","Data":"13865e7f4379cd1ea8909893d4d56552c700642d92249f853f7f0616859eed94"} Oct 09 10:41:24 crc kubenswrapper[4740]: I1009 10:41:24.517501 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7glx9"] Oct 09 10:41:25 crc kubenswrapper[4740]: I1009 10:41:25.121910 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-gvdpg"] Oct 09 10:41:25 crc kubenswrapper[4740]: I1009 10:41:25.123043 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-gvdpg" Oct 09 10:41:25 crc kubenswrapper[4740]: I1009 10:41:25.139644 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gvdpg"] Oct 09 10:41:25 crc kubenswrapper[4740]: I1009 10:41:25.305100 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzrw6\" (UniqueName: \"kubernetes.io/projected/aad20335-936b-4ec4-aced-424bf31edf74-kube-api-access-vzrw6\") pod \"openstack-operator-index-gvdpg\" (UID: \"aad20335-936b-4ec4-aced-424bf31edf74\") " pod="openstack-operators/openstack-operator-index-gvdpg" Oct 09 10:41:25 crc kubenswrapper[4740]: I1009 10:41:25.406872 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzrw6\" (UniqueName: \"kubernetes.io/projected/aad20335-936b-4ec4-aced-424bf31edf74-kube-api-access-vzrw6\") pod \"openstack-operator-index-gvdpg\" (UID: \"aad20335-936b-4ec4-aced-424bf31edf74\") " pod="openstack-operators/openstack-operator-index-gvdpg" Oct 09 10:41:25 crc kubenswrapper[4740]: I1009 10:41:25.426532 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzrw6\" (UniqueName: \"kubernetes.io/projected/aad20335-936b-4ec4-aced-424bf31edf74-kube-api-access-vzrw6\") pod \"openstack-operator-index-gvdpg\" (UID: \"aad20335-936b-4ec4-aced-424bf31edf74\") " pod="openstack-operators/openstack-operator-index-gvdpg" Oct 09 10:41:25 crc kubenswrapper[4740]: I1009 10:41:25.439432 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-gvdpg" Oct 09 10:41:25 crc kubenswrapper[4740]: I1009 10:41:25.456863 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7glx9" event={"ID":"7a2f31c2-29b9-4126-ad6e-1500e08b6959","Type":"ContainerStarted","Data":"18bee2f5ab6f5df2723fccac148f53b3b9575f300a56a698fb616cff4fb462b5"} Oct 09 10:41:25 crc kubenswrapper[4740]: I1009 10:41:25.457063 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-7glx9" podUID="7a2f31c2-29b9-4126-ad6e-1500e08b6959" containerName="registry-server" containerID="cri-o://18bee2f5ab6f5df2723fccac148f53b3b9575f300a56a698fb616cff4fb462b5" gracePeriod=2 Oct 09 10:41:25 crc kubenswrapper[4740]: I1009 10:41:25.481609 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-7glx9" podStartSLOduration=1.812730303 podStartE2EDuration="4.481578817s" podCreationTimestamp="2025-10-09 10:41:21 +0000 UTC" firstStartedPulling="2025-10-09 10:41:21.90913226 +0000 UTC m=+820.871332641" lastFinishedPulling="2025-10-09 10:41:24.577980774 +0000 UTC m=+823.540181155" observedRunningTime="2025-10-09 10:41:25.476068324 +0000 UTC m=+824.438268715" watchObservedRunningTime="2025-10-09 10:41:25.481578817 +0000 UTC m=+824.443779238" Oct 09 10:41:25 crc kubenswrapper[4740]: I1009 10:41:25.884390 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7glx9" Oct 09 10:41:25 crc kubenswrapper[4740]: I1009 10:41:25.900302 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gvdpg"] Oct 09 10:41:25 crc kubenswrapper[4740]: W1009 10:41:25.905323 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaad20335_936b_4ec4_aced_424bf31edf74.slice/crio-31473ad6c3c31656e502e527ab6ebecbaf9d3b1986047a758d745c60d18eaf9a WatchSource:0}: Error finding container 31473ad6c3c31656e502e527ab6ebecbaf9d3b1986047a758d745c60d18eaf9a: Status 404 returned error can't find the container with id 31473ad6c3c31656e502e527ab6ebecbaf9d3b1986047a758d745c60d18eaf9a Oct 09 10:41:26 crc kubenswrapper[4740]: I1009 10:41:26.013485 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmwz2\" (UniqueName: \"kubernetes.io/projected/7a2f31c2-29b9-4126-ad6e-1500e08b6959-kube-api-access-kmwz2\") pod \"7a2f31c2-29b9-4126-ad6e-1500e08b6959\" (UID: \"7a2f31c2-29b9-4126-ad6e-1500e08b6959\") " Oct 09 10:41:26 crc kubenswrapper[4740]: I1009 10:41:26.017812 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a2f31c2-29b9-4126-ad6e-1500e08b6959-kube-api-access-kmwz2" (OuterVolumeSpecName: "kube-api-access-kmwz2") pod "7a2f31c2-29b9-4126-ad6e-1500e08b6959" (UID: "7a2f31c2-29b9-4126-ad6e-1500e08b6959"). InnerVolumeSpecName "kube-api-access-kmwz2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:41:26 crc kubenswrapper[4740]: I1009 10:41:26.114972 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmwz2\" (UniqueName: \"kubernetes.io/projected/7a2f31c2-29b9-4126-ad6e-1500e08b6959-kube-api-access-kmwz2\") on node \"crc\" DevicePath \"\"" Oct 09 10:41:26 crc kubenswrapper[4740]: I1009 10:41:26.466788 4740 generic.go:334] "Generic (PLEG): container finished" podID="7a2f31c2-29b9-4126-ad6e-1500e08b6959" containerID="18bee2f5ab6f5df2723fccac148f53b3b9575f300a56a698fb616cff4fb462b5" exitCode=0 Oct 09 10:41:26 crc kubenswrapper[4740]: I1009 10:41:26.466852 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7glx9" Oct 09 10:41:26 crc kubenswrapper[4740]: I1009 10:41:26.466898 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7glx9" event={"ID":"7a2f31c2-29b9-4126-ad6e-1500e08b6959","Type":"ContainerDied","Data":"18bee2f5ab6f5df2723fccac148f53b3b9575f300a56a698fb616cff4fb462b5"} Oct 09 10:41:26 crc kubenswrapper[4740]: I1009 10:41:26.466951 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7glx9" event={"ID":"7a2f31c2-29b9-4126-ad6e-1500e08b6959","Type":"ContainerDied","Data":"13865e7f4379cd1ea8909893d4d56552c700642d92249f853f7f0616859eed94"} Oct 09 10:41:26 crc kubenswrapper[4740]: I1009 10:41:26.466982 4740 scope.go:117] "RemoveContainer" containerID="18bee2f5ab6f5df2723fccac148f53b3b9575f300a56a698fb616cff4fb462b5" Oct 09 10:41:26 crc kubenswrapper[4740]: I1009 10:41:26.470702 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gvdpg" event={"ID":"aad20335-936b-4ec4-aced-424bf31edf74","Type":"ContainerStarted","Data":"10a9fe9c9d0cb92961b703a84dd5d6e7e859827784bcd50cd25b3c60ac6bb28e"} Oct 09 10:41:26 crc kubenswrapper[4740]: I1009 
10:41:26.470784 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gvdpg" event={"ID":"aad20335-936b-4ec4-aced-424bf31edf74","Type":"ContainerStarted","Data":"31473ad6c3c31656e502e527ab6ebecbaf9d3b1986047a758d745c60d18eaf9a"} Oct 09 10:41:26 crc kubenswrapper[4740]: I1009 10:41:26.492526 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-gvdpg" podStartSLOduration=1.439423577 podStartE2EDuration="1.492495831s" podCreationTimestamp="2025-10-09 10:41:25 +0000 UTC" firstStartedPulling="2025-10-09 10:41:25.914955712 +0000 UTC m=+824.877156093" lastFinishedPulling="2025-10-09 10:41:25.968027976 +0000 UTC m=+824.930228347" observedRunningTime="2025-10-09 10:41:26.489304892 +0000 UTC m=+825.451505333" watchObservedRunningTime="2025-10-09 10:41:26.492495831 +0000 UTC m=+825.454696292" Oct 09 10:41:26 crc kubenswrapper[4740]: I1009 10:41:26.502279 4740 scope.go:117] "RemoveContainer" containerID="18bee2f5ab6f5df2723fccac148f53b3b9575f300a56a698fb616cff4fb462b5" Oct 09 10:41:26 crc kubenswrapper[4740]: E1009 10:41:26.503684 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18bee2f5ab6f5df2723fccac148f53b3b9575f300a56a698fb616cff4fb462b5\": container with ID starting with 18bee2f5ab6f5df2723fccac148f53b3b9575f300a56a698fb616cff4fb462b5 not found: ID does not exist" containerID="18bee2f5ab6f5df2723fccac148f53b3b9575f300a56a698fb616cff4fb462b5" Oct 09 10:41:26 crc kubenswrapper[4740]: I1009 10:41:26.503803 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18bee2f5ab6f5df2723fccac148f53b3b9575f300a56a698fb616cff4fb462b5"} err="failed to get container status \"18bee2f5ab6f5df2723fccac148f53b3b9575f300a56a698fb616cff4fb462b5\": rpc error: code = NotFound desc = could not find container 
\"18bee2f5ab6f5df2723fccac148f53b3b9575f300a56a698fb616cff4fb462b5\": container with ID starting with 18bee2f5ab6f5df2723fccac148f53b3b9575f300a56a698fb616cff4fb462b5 not found: ID does not exist" Oct 09 10:41:26 crc kubenswrapper[4740]: I1009 10:41:26.504213 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-76cjs" Oct 09 10:41:26 crc kubenswrapper[4740]: I1009 10:41:26.509961 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7glx9"] Oct 09 10:41:26 crc kubenswrapper[4740]: I1009 10:41:26.517628 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-7glx9"] Oct 09 10:41:26 crc kubenswrapper[4740]: I1009 10:41:26.939527 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-8dwvb" Oct 09 10:41:27 crc kubenswrapper[4740]: I1009 10:41:27.773142 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a2f31c2-29b9-4126-ad6e-1500e08b6959" path="/var/lib/kubelet/pods/7a2f31c2-29b9-4126-ad6e-1500e08b6959/volumes" Oct 09 10:41:35 crc kubenswrapper[4740]: I1009 10:41:35.440606 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-gvdpg" Oct 09 10:41:35 crc kubenswrapper[4740]: I1009 10:41:35.441480 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-gvdpg" Oct 09 10:41:35 crc kubenswrapper[4740]: I1009 10:41:35.482537 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-gvdpg" Oct 09 10:41:35 crc kubenswrapper[4740]: I1009 10:41:35.568680 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-gvdpg" Oct 09 10:41:36 crc kubenswrapper[4740]: 
I1009 10:41:36.560179 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-svscr" Oct 09 10:41:37 crc kubenswrapper[4740]: I1009 10:41:37.732992 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hcpc4"] Oct 09 10:41:37 crc kubenswrapper[4740]: E1009 10:41:37.733350 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a2f31c2-29b9-4126-ad6e-1500e08b6959" containerName="registry-server" Oct 09 10:41:37 crc kubenswrapper[4740]: I1009 10:41:37.733372 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a2f31c2-29b9-4126-ad6e-1500e08b6959" containerName="registry-server" Oct 09 10:41:37 crc kubenswrapper[4740]: I1009 10:41:37.733607 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a2f31c2-29b9-4126-ad6e-1500e08b6959" containerName="registry-server" Oct 09 10:41:37 crc kubenswrapper[4740]: I1009 10:41:37.735034 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hcpc4" Oct 09 10:41:37 crc kubenswrapper[4740]: I1009 10:41:37.750003 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hcpc4"] Oct 09 10:41:37 crc kubenswrapper[4740]: I1009 10:41:37.893853 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbrz6\" (UniqueName: \"kubernetes.io/projected/a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12-kube-api-access-lbrz6\") pod \"certified-operators-hcpc4\" (UID: \"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12\") " pod="openshift-marketplace/certified-operators-hcpc4" Oct 09 10:41:37 crc kubenswrapper[4740]: I1009 10:41:37.893964 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12-catalog-content\") pod \"certified-operators-hcpc4\" (UID: \"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12\") " pod="openshift-marketplace/certified-operators-hcpc4" Oct 09 10:41:37 crc kubenswrapper[4740]: I1009 10:41:37.894060 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12-utilities\") pod \"certified-operators-hcpc4\" (UID: \"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12\") " pod="openshift-marketplace/certified-operators-hcpc4" Oct 09 10:41:37 crc kubenswrapper[4740]: I1009 10:41:37.994965 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12-utilities\") pod \"certified-operators-hcpc4\" (UID: \"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12\") " pod="openshift-marketplace/certified-operators-hcpc4" Oct 09 10:41:37 crc kubenswrapper[4740]: I1009 10:41:37.995074 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lbrz6\" (UniqueName: \"kubernetes.io/projected/a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12-kube-api-access-lbrz6\") pod \"certified-operators-hcpc4\" (UID: \"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12\") " pod="openshift-marketplace/certified-operators-hcpc4" Oct 09 10:41:37 crc kubenswrapper[4740]: I1009 10:41:37.995187 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12-catalog-content\") pod \"certified-operators-hcpc4\" (UID: \"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12\") " pod="openshift-marketplace/certified-operators-hcpc4" Oct 09 10:41:37 crc kubenswrapper[4740]: I1009 10:41:37.995851 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12-utilities\") pod \"certified-operators-hcpc4\" (UID: \"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12\") " pod="openshift-marketplace/certified-operators-hcpc4" Oct 09 10:41:37 crc kubenswrapper[4740]: I1009 10:41:37.995970 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12-catalog-content\") pod \"certified-operators-hcpc4\" (UID: \"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12\") " pod="openshift-marketplace/certified-operators-hcpc4" Oct 09 10:41:38 crc kubenswrapper[4740]: I1009 10:41:38.023783 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbrz6\" (UniqueName: \"kubernetes.io/projected/a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12-kube-api-access-lbrz6\") pod \"certified-operators-hcpc4\" (UID: \"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12\") " pod="openshift-marketplace/certified-operators-hcpc4" Oct 09 10:41:38 crc kubenswrapper[4740]: I1009 10:41:38.068146 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hcpc4" Oct 09 10:41:38 crc kubenswrapper[4740]: I1009 10:41:38.504400 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hcpc4"] Oct 09 10:41:38 crc kubenswrapper[4740]: I1009 10:41:38.558110 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hcpc4" event={"ID":"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12","Type":"ContainerStarted","Data":"3784042afe1ca55689e41338f66d14c93f4f8913f62da32f80695c77ccab39db"} Oct 09 10:41:39 crc kubenswrapper[4740]: I1009 10:41:39.408732 4740 patch_prober.go:28] interesting pod/dns-default-xhb6x container/dns namespace/openshift-dns: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=kubernetes Oct 09 10:41:39 crc kubenswrapper[4740]: I1009 10:41:39.411433 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-dns/dns-default-xhb6x" podUID="b6af8fb0-93bd-444e-9d7a-1e6a1a42ef7e" containerName="dns" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 09 10:41:39 crc kubenswrapper[4740]: I1009 10:41:39.567271 4740 generic.go:334] "Generic (PLEG): container finished" podID="a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12" containerID="c564df5f77b8101caa2b2bc874d75676d43f92d01a0c10eb09dcee87d325eaea" exitCode=0 Oct 09 10:41:39 crc kubenswrapper[4740]: I1009 10:41:39.567328 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hcpc4" event={"ID":"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12","Type":"ContainerDied","Data":"c564df5f77b8101caa2b2bc874d75676d43f92d01a0c10eb09dcee87d325eaea"} Oct 09 10:41:40 crc kubenswrapper[4740]: I1009 10:41:40.576797 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hcpc4" 
event={"ID":"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12","Type":"ContainerStarted","Data":"4163e7c038eaab0d0c278b837c1b4d65979d3c82feb3183ad03c85640471da73"} Oct 09 10:41:41 crc kubenswrapper[4740]: I1009 10:41:41.585834 4740 generic.go:334] "Generic (PLEG): container finished" podID="a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12" containerID="4163e7c038eaab0d0c278b837c1b4d65979d3c82feb3183ad03c85640471da73" exitCode=0 Oct 09 10:41:41 crc kubenswrapper[4740]: I1009 10:41:41.585884 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hcpc4" event={"ID":"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12","Type":"ContainerDied","Data":"4163e7c038eaab0d0c278b837c1b4d65979d3c82feb3183ad03c85640471da73"} Oct 09 10:41:42 crc kubenswrapper[4740]: I1009 10:41:42.530009 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wdj4w"] Oct 09 10:41:42 crc kubenswrapper[4740]: I1009 10:41:42.532166 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wdj4w" Oct 09 10:41:42 crc kubenswrapper[4740]: I1009 10:41:42.543376 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wdj4w"] Oct 09 10:41:42 crc kubenswrapper[4740]: I1009 10:41:42.600323 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hcpc4" event={"ID":"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12","Type":"ContainerStarted","Data":"d281fa0da01f76f42a12a713ec8e0ce151a4a01493b66909f866aaafb982dc0f"} Oct 09 10:41:42 crc kubenswrapper[4740]: I1009 10:41:42.621498 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hcpc4" podStartSLOduration=3.118445667 podStartE2EDuration="5.621481046s" podCreationTimestamp="2025-10-09 10:41:37 +0000 UTC" firstStartedPulling="2025-10-09 10:41:39.569850612 +0000 UTC m=+838.532051033" lastFinishedPulling="2025-10-09 10:41:42.072886001 +0000 UTC m=+841.035086412" observedRunningTime="2025-10-09 10:41:42.617701191 +0000 UTC m=+841.579901582" watchObservedRunningTime="2025-10-09 10:41:42.621481046 +0000 UTC m=+841.583681437" Oct 09 10:41:42 crc kubenswrapper[4740]: I1009 10:41:42.660774 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2ccw\" (UniqueName: \"kubernetes.io/projected/0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab-kube-api-access-w2ccw\") pod \"community-operators-wdj4w\" (UID: \"0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab\") " pod="openshift-marketplace/community-operators-wdj4w" Oct 09 10:41:42 crc kubenswrapper[4740]: I1009 10:41:42.660841 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab-catalog-content\") pod \"community-operators-wdj4w\" (UID: \"0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab\") " 
pod="openshift-marketplace/community-operators-wdj4w" Oct 09 10:41:42 crc kubenswrapper[4740]: I1009 10:41:42.660884 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab-utilities\") pod \"community-operators-wdj4w\" (UID: \"0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab\") " pod="openshift-marketplace/community-operators-wdj4w" Oct 09 10:41:42 crc kubenswrapper[4740]: I1009 10:41:42.761976 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2ccw\" (UniqueName: \"kubernetes.io/projected/0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab-kube-api-access-w2ccw\") pod \"community-operators-wdj4w\" (UID: \"0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab\") " pod="openshift-marketplace/community-operators-wdj4w" Oct 09 10:41:42 crc kubenswrapper[4740]: I1009 10:41:42.762280 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab-catalog-content\") pod \"community-operators-wdj4w\" (UID: \"0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab\") " pod="openshift-marketplace/community-operators-wdj4w" Oct 09 10:41:42 crc kubenswrapper[4740]: I1009 10:41:42.762411 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab-utilities\") pod \"community-operators-wdj4w\" (UID: \"0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab\") " pod="openshift-marketplace/community-operators-wdj4w" Oct 09 10:41:42 crc kubenswrapper[4740]: I1009 10:41:42.762848 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab-catalog-content\") pod \"community-operators-wdj4w\" (UID: \"0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab\") " 
pod="openshift-marketplace/community-operators-wdj4w" Oct 09 10:41:42 crc kubenswrapper[4740]: I1009 10:41:42.762898 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab-utilities\") pod \"community-operators-wdj4w\" (UID: \"0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab\") " pod="openshift-marketplace/community-operators-wdj4w" Oct 09 10:41:42 crc kubenswrapper[4740]: I1009 10:41:42.789309 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2ccw\" (UniqueName: \"kubernetes.io/projected/0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab-kube-api-access-w2ccw\") pod \"community-operators-wdj4w\" (UID: \"0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab\") " pod="openshift-marketplace/community-operators-wdj4w" Oct 09 10:41:42 crc kubenswrapper[4740]: I1009 10:41:42.850526 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wdj4w" Oct 09 10:41:43 crc kubenswrapper[4740]: I1009 10:41:43.136226 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wdj4w"] Oct 09 10:41:43 crc kubenswrapper[4740]: I1009 10:41:43.197192 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44"] Oct 09 10:41:43 crc kubenswrapper[4740]: I1009 10:41:43.198651 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44" Oct 09 10:41:43 crc kubenswrapper[4740]: I1009 10:41:43.203038 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-dvdx9" Oct 09 10:41:43 crc kubenswrapper[4740]: W1009 10:41:43.212904 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bf1fbb5_80bc_4b41_86bc_7fa73c2d30ab.slice/crio-f34d003c422d1b9a7a7996d0dac4b540effb30e3ac43b88acdf5d560e8a7d6e8 WatchSource:0}: Error finding container f34d003c422d1b9a7a7996d0dac4b540effb30e3ac43b88acdf5d560e8a7d6e8: Status 404 returned error can't find the container with id f34d003c422d1b9a7a7996d0dac4b540effb30e3ac43b88acdf5d560e8a7d6e8 Oct 09 10:41:43 crc kubenswrapper[4740]: I1009 10:41:43.217449 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44"] Oct 09 10:41:43 crc kubenswrapper[4740]: I1009 10:41:43.368854 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54mz5\" (UniqueName: \"kubernetes.io/projected/d1d10dcf-922d-4d14-ac25-0b8482757670-kube-api-access-54mz5\") pod \"d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44\" (UID: \"d1d10dcf-922d-4d14-ac25-0b8482757670\") " pod="openstack-operators/d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44" Oct 09 10:41:43 crc kubenswrapper[4740]: I1009 10:41:43.368972 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1d10dcf-922d-4d14-ac25-0b8482757670-util\") pod \"d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44\" (UID: \"d1d10dcf-922d-4d14-ac25-0b8482757670\") " 
pod="openstack-operators/d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44" Oct 09 10:41:43 crc kubenswrapper[4740]: I1009 10:41:43.369034 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1d10dcf-922d-4d14-ac25-0b8482757670-bundle\") pod \"d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44\" (UID: \"d1d10dcf-922d-4d14-ac25-0b8482757670\") " pod="openstack-operators/d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44" Oct 09 10:41:43 crc kubenswrapper[4740]: I1009 10:41:43.470374 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1d10dcf-922d-4d14-ac25-0b8482757670-bundle\") pod \"d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44\" (UID: \"d1d10dcf-922d-4d14-ac25-0b8482757670\") " pod="openstack-operators/d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44" Oct 09 10:41:43 crc kubenswrapper[4740]: I1009 10:41:43.470654 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54mz5\" (UniqueName: \"kubernetes.io/projected/d1d10dcf-922d-4d14-ac25-0b8482757670-kube-api-access-54mz5\") pod \"d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44\" (UID: \"d1d10dcf-922d-4d14-ac25-0b8482757670\") " pod="openstack-operators/d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44" Oct 09 10:41:43 crc kubenswrapper[4740]: I1009 10:41:43.470766 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1d10dcf-922d-4d14-ac25-0b8482757670-util\") pod \"d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44\" (UID: \"d1d10dcf-922d-4d14-ac25-0b8482757670\") " pod="openstack-operators/d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44" Oct 09 10:41:43 crc kubenswrapper[4740]: I1009 
10:41:43.470842 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1d10dcf-922d-4d14-ac25-0b8482757670-bundle\") pod \"d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44\" (UID: \"d1d10dcf-922d-4d14-ac25-0b8482757670\") " pod="openstack-operators/d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44" Oct 09 10:41:43 crc kubenswrapper[4740]: I1009 10:41:43.471239 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1d10dcf-922d-4d14-ac25-0b8482757670-util\") pod \"d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44\" (UID: \"d1d10dcf-922d-4d14-ac25-0b8482757670\") " pod="openstack-operators/d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44" Oct 09 10:41:43 crc kubenswrapper[4740]: I1009 10:41:43.492128 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54mz5\" (UniqueName: \"kubernetes.io/projected/d1d10dcf-922d-4d14-ac25-0b8482757670-kube-api-access-54mz5\") pod \"d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44\" (UID: \"d1d10dcf-922d-4d14-ac25-0b8482757670\") " pod="openstack-operators/d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44" Oct 09 10:41:43 crc kubenswrapper[4740]: I1009 10:41:43.530487 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44" Oct 09 10:41:43 crc kubenswrapper[4740]: I1009 10:41:43.612524 4740 generic.go:334] "Generic (PLEG): container finished" podID="0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab" containerID="f3904b9e894d320797b29f98121b44b9c246b90cca868429090f59d9bad35e40" exitCode=0 Oct 09 10:41:43 crc kubenswrapper[4740]: I1009 10:41:43.613360 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdj4w" event={"ID":"0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab","Type":"ContainerDied","Data":"f3904b9e894d320797b29f98121b44b9c246b90cca868429090f59d9bad35e40"} Oct 09 10:41:43 crc kubenswrapper[4740]: I1009 10:41:43.613385 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdj4w" event={"ID":"0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab","Type":"ContainerStarted","Data":"f34d003c422d1b9a7a7996d0dac4b540effb30e3ac43b88acdf5d560e8a7d6e8"} Oct 09 10:41:43 crc kubenswrapper[4740]: I1009 10:41:43.951714 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44"] Oct 09 10:41:43 crc kubenswrapper[4740]: W1009 10:41:43.956667 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1d10dcf_922d_4d14_ac25_0b8482757670.slice/crio-684cd3f7f065854bb60f6c7ce6df3ccf49ac23707e4640f2206915a30aa4a661 WatchSource:0}: Error finding container 684cd3f7f065854bb60f6c7ce6df3ccf49ac23707e4640f2206915a30aa4a661: Status 404 returned error can't find the container with id 684cd3f7f065854bb60f6c7ce6df3ccf49ac23707e4640f2206915a30aa4a661 Oct 09 10:41:44 crc kubenswrapper[4740]: I1009 10:41:44.619330 4740 generic.go:334] "Generic (PLEG): container finished" podID="d1d10dcf-922d-4d14-ac25-0b8482757670" 
containerID="a2e04582c58800181882ed7dab8668ce79526ffa868ae05e30571f5719ace9d8" exitCode=0 Oct 09 10:41:44 crc kubenswrapper[4740]: I1009 10:41:44.621123 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44" event={"ID":"d1d10dcf-922d-4d14-ac25-0b8482757670","Type":"ContainerDied","Data":"a2e04582c58800181882ed7dab8668ce79526ffa868ae05e30571f5719ace9d8"} Oct 09 10:41:44 crc kubenswrapper[4740]: I1009 10:41:44.621183 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44" event={"ID":"d1d10dcf-922d-4d14-ac25-0b8482757670","Type":"ContainerStarted","Data":"684cd3f7f065854bb60f6c7ce6df3ccf49ac23707e4640f2206915a30aa4a661"} Oct 09 10:41:44 crc kubenswrapper[4740]: I1009 10:41:44.624996 4740 generic.go:334] "Generic (PLEG): container finished" podID="0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab" containerID="e78828d7a2b23eae54b4feb54c2cfcfd23d5c007e1fd384e7b89ed95e525279a" exitCode=0 Oct 09 10:41:44 crc kubenswrapper[4740]: I1009 10:41:44.625073 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdj4w" event={"ID":"0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab","Type":"ContainerDied","Data":"e78828d7a2b23eae54b4feb54c2cfcfd23d5c007e1fd384e7b89ed95e525279a"} Oct 09 10:41:45 crc kubenswrapper[4740]: I1009 10:41:45.632361 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdj4w" event={"ID":"0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab","Type":"ContainerStarted","Data":"5f2690fbf5684757d40a566cf2d4dcf5e5292d5e0f1f5f697d1226cffe961e5d"} Oct 09 10:41:45 crc kubenswrapper[4740]: I1009 10:41:45.634320 4740 generic.go:334] "Generic (PLEG): container finished" podID="d1d10dcf-922d-4d14-ac25-0b8482757670" containerID="4b3a2d1db587b9c2236146947305f2e788b6bc93899a635afbf992827ed84d9f" exitCode=0 Oct 09 10:41:45 crc 
kubenswrapper[4740]: I1009 10:41:45.634359 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44" event={"ID":"d1d10dcf-922d-4d14-ac25-0b8482757670","Type":"ContainerDied","Data":"4b3a2d1db587b9c2236146947305f2e788b6bc93899a635afbf992827ed84d9f"} Oct 09 10:41:45 crc kubenswrapper[4740]: I1009 10:41:45.713333 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wdj4w" podStartSLOduration=2.322976726 podStartE2EDuration="3.713313727s" podCreationTimestamp="2025-10-09 10:41:42 +0000 UTC" firstStartedPulling="2025-10-09 10:41:43.613978468 +0000 UTC m=+842.576178869" lastFinishedPulling="2025-10-09 10:41:45.004315489 +0000 UTC m=+843.966515870" observedRunningTime="2025-10-09 10:41:45.689033703 +0000 UTC m=+844.651234084" watchObservedRunningTime="2025-10-09 10:41:45.713313727 +0000 UTC m=+844.675514108" Oct 09 10:41:46 crc kubenswrapper[4740]: I1009 10:41:46.646737 4740 generic.go:334] "Generic (PLEG): container finished" podID="d1d10dcf-922d-4d14-ac25-0b8482757670" containerID="fc30dfee9439bc855f997ad954abb19e854f80d89d8c07311f98ed06b80ff338" exitCode=0 Oct 09 10:41:46 crc kubenswrapper[4740]: I1009 10:41:46.648119 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44" event={"ID":"d1d10dcf-922d-4d14-ac25-0b8482757670","Type":"ContainerDied","Data":"fc30dfee9439bc855f997ad954abb19e854f80d89d8c07311f98ed06b80ff338"} Oct 09 10:41:47 crc kubenswrapper[4740]: I1009 10:41:47.935497 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44" Oct 09 10:41:48 crc kubenswrapper[4740]: I1009 10:41:48.038731 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1d10dcf-922d-4d14-ac25-0b8482757670-util\") pod \"d1d10dcf-922d-4d14-ac25-0b8482757670\" (UID: \"d1d10dcf-922d-4d14-ac25-0b8482757670\") " Oct 09 10:41:48 crc kubenswrapper[4740]: I1009 10:41:48.038874 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54mz5\" (UniqueName: \"kubernetes.io/projected/d1d10dcf-922d-4d14-ac25-0b8482757670-kube-api-access-54mz5\") pod \"d1d10dcf-922d-4d14-ac25-0b8482757670\" (UID: \"d1d10dcf-922d-4d14-ac25-0b8482757670\") " Oct 09 10:41:48 crc kubenswrapper[4740]: I1009 10:41:48.038911 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1d10dcf-922d-4d14-ac25-0b8482757670-bundle\") pod \"d1d10dcf-922d-4d14-ac25-0b8482757670\" (UID: \"d1d10dcf-922d-4d14-ac25-0b8482757670\") " Oct 09 10:41:48 crc kubenswrapper[4740]: I1009 10:41:48.039861 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1d10dcf-922d-4d14-ac25-0b8482757670-bundle" (OuterVolumeSpecName: "bundle") pod "d1d10dcf-922d-4d14-ac25-0b8482757670" (UID: "d1d10dcf-922d-4d14-ac25-0b8482757670"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:41:48 crc kubenswrapper[4740]: I1009 10:41:48.045500 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1d10dcf-922d-4d14-ac25-0b8482757670-kube-api-access-54mz5" (OuterVolumeSpecName: "kube-api-access-54mz5") pod "d1d10dcf-922d-4d14-ac25-0b8482757670" (UID: "d1d10dcf-922d-4d14-ac25-0b8482757670"). InnerVolumeSpecName "kube-api-access-54mz5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:41:48 crc kubenswrapper[4740]: I1009 10:41:48.054974 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1d10dcf-922d-4d14-ac25-0b8482757670-util" (OuterVolumeSpecName: "util") pod "d1d10dcf-922d-4d14-ac25-0b8482757670" (UID: "d1d10dcf-922d-4d14-ac25-0b8482757670"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:41:48 crc kubenswrapper[4740]: I1009 10:41:48.068632 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hcpc4" Oct 09 10:41:48 crc kubenswrapper[4740]: I1009 10:41:48.069175 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hcpc4" Oct 09 10:41:48 crc kubenswrapper[4740]: I1009 10:41:48.129169 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hcpc4" Oct 09 10:41:48 crc kubenswrapper[4740]: I1009 10:41:48.140669 4740 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1d10dcf-922d-4d14-ac25-0b8482757670-util\") on node \"crc\" DevicePath \"\"" Oct 09 10:41:48 crc kubenswrapper[4740]: I1009 10:41:48.140699 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54mz5\" (UniqueName: \"kubernetes.io/projected/d1d10dcf-922d-4d14-ac25-0b8482757670-kube-api-access-54mz5\") on node \"crc\" DevicePath \"\"" Oct 09 10:41:48 crc kubenswrapper[4740]: I1009 10:41:48.140708 4740 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1d10dcf-922d-4d14-ac25-0b8482757670-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:41:48 crc kubenswrapper[4740]: I1009 10:41:48.663467 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44" event={"ID":"d1d10dcf-922d-4d14-ac25-0b8482757670","Type":"ContainerDied","Data":"684cd3f7f065854bb60f6c7ce6df3ccf49ac23707e4640f2206915a30aa4a661"} Oct 09 10:41:48 crc kubenswrapper[4740]: I1009 10:41:48.663563 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44" Oct 09 10:41:48 crc kubenswrapper[4740]: I1009 10:41:48.663586 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="684cd3f7f065854bb60f6c7ce6df3ccf49ac23707e4640f2206915a30aa4a661" Oct 09 10:41:48 crc kubenswrapper[4740]: I1009 10:41:48.734966 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hcpc4" Oct 09 10:41:50 crc kubenswrapper[4740]: I1009 10:41:50.351773 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6c6ccc6df6-wxp9l"] Oct 09 10:41:50 crc kubenswrapper[4740]: E1009 10:41:50.352054 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1d10dcf-922d-4d14-ac25-0b8482757670" containerName="util" Oct 09 10:41:50 crc kubenswrapper[4740]: I1009 10:41:50.352070 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1d10dcf-922d-4d14-ac25-0b8482757670" containerName="util" Oct 09 10:41:50 crc kubenswrapper[4740]: E1009 10:41:50.352092 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1d10dcf-922d-4d14-ac25-0b8482757670" containerName="extract" Oct 09 10:41:50 crc kubenswrapper[4740]: I1009 10:41:50.352100 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1d10dcf-922d-4d14-ac25-0b8482757670" containerName="extract" Oct 09 10:41:50 crc kubenswrapper[4740]: E1009 10:41:50.352117 4740 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d1d10dcf-922d-4d14-ac25-0b8482757670" containerName="pull" Oct 09 10:41:50 crc kubenswrapper[4740]: I1009 10:41:50.352125 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1d10dcf-922d-4d14-ac25-0b8482757670" containerName="pull" Oct 09 10:41:50 crc kubenswrapper[4740]: I1009 10:41:50.352258 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1d10dcf-922d-4d14-ac25-0b8482757670" containerName="extract" Oct 09 10:41:50 crc kubenswrapper[4740]: I1009 10:41:50.352961 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6c6ccc6df6-wxp9l" Oct 09 10:41:50 crc kubenswrapper[4740]: I1009 10:41:50.357401 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-vjd2t" Oct 09 10:41:50 crc kubenswrapper[4740]: I1009 10:41:50.410273 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6c6ccc6df6-wxp9l"] Oct 09 10:41:50 crc kubenswrapper[4740]: I1009 10:41:50.470881 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq65k\" (UniqueName: \"kubernetes.io/projected/e0e26b11-7270-46ec-9042-0eaab1e2a459-kube-api-access-mq65k\") pod \"openstack-operator-controller-operator-6c6ccc6df6-wxp9l\" (UID: \"e0e26b11-7270-46ec-9042-0eaab1e2a459\") " pod="openstack-operators/openstack-operator-controller-operator-6c6ccc6df6-wxp9l" Oct 09 10:41:50 crc kubenswrapper[4740]: I1009 10:41:50.572446 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq65k\" (UniqueName: \"kubernetes.io/projected/e0e26b11-7270-46ec-9042-0eaab1e2a459-kube-api-access-mq65k\") pod \"openstack-operator-controller-operator-6c6ccc6df6-wxp9l\" (UID: \"e0e26b11-7270-46ec-9042-0eaab1e2a459\") " 
pod="openstack-operators/openstack-operator-controller-operator-6c6ccc6df6-wxp9l" Oct 09 10:41:50 crc kubenswrapper[4740]: I1009 10:41:50.596561 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq65k\" (UniqueName: \"kubernetes.io/projected/e0e26b11-7270-46ec-9042-0eaab1e2a459-kube-api-access-mq65k\") pod \"openstack-operator-controller-operator-6c6ccc6df6-wxp9l\" (UID: \"e0e26b11-7270-46ec-9042-0eaab1e2a459\") " pod="openstack-operators/openstack-operator-controller-operator-6c6ccc6df6-wxp9l" Oct 09 10:41:50 crc kubenswrapper[4740]: I1009 10:41:50.669387 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6c6ccc6df6-wxp9l" Oct 09 10:41:51 crc kubenswrapper[4740]: I1009 10:41:51.100495 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6c6ccc6df6-wxp9l"] Oct 09 10:41:51 crc kubenswrapper[4740]: I1009 10:41:51.515393 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hcpc4"] Oct 09 10:41:51 crc kubenswrapper[4740]: I1009 10:41:51.685673 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6c6ccc6df6-wxp9l" event={"ID":"e0e26b11-7270-46ec-9042-0eaab1e2a459","Type":"ContainerStarted","Data":"a2a2f4a64395eb73040f314f1a80a64a197ea3b02695e117c4978ca1f8405617"} Oct 09 10:41:51 crc kubenswrapper[4740]: I1009 10:41:51.685825 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hcpc4" podUID="a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12" containerName="registry-server" containerID="cri-o://d281fa0da01f76f42a12a713ec8e0ce151a4a01493b66909f866aaafb982dc0f" gracePeriod=2 Oct 09 10:41:51 crc kubenswrapper[4740]: E1009 10:41:51.821824 4740 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3fa39f1_d9c3_4c58_acea_0dbb7abfbb12.slice/crio-conmon-d281fa0da01f76f42a12a713ec8e0ce151a4a01493b66909f866aaafb982dc0f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3fa39f1_d9c3_4c58_acea_0dbb7abfbb12.slice/crio-d281fa0da01f76f42a12a713ec8e0ce151a4a01493b66909f866aaafb982dc0f.scope\": RecentStats: unable to find data in memory cache]" Oct 09 10:41:52 crc kubenswrapper[4740]: I1009 10:41:52.096472 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hcpc4" Oct 09 10:41:52 crc kubenswrapper[4740]: I1009 10:41:52.195963 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbrz6\" (UniqueName: \"kubernetes.io/projected/a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12-kube-api-access-lbrz6\") pod \"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12\" (UID: \"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12\") " Oct 09 10:41:52 crc kubenswrapper[4740]: I1009 10:41:52.196051 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12-utilities\") pod \"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12\" (UID: \"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12\") " Oct 09 10:41:52 crc kubenswrapper[4740]: I1009 10:41:52.196080 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12-catalog-content\") pod \"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12\" (UID: \"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12\") " Oct 09 10:41:52 crc kubenswrapper[4740]: I1009 10:41:52.197666 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12-utilities" (OuterVolumeSpecName: "utilities") pod "a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12" (UID: "a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:41:52 crc kubenswrapper[4740]: I1009 10:41:52.208039 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12-kube-api-access-lbrz6" (OuterVolumeSpecName: "kube-api-access-lbrz6") pod "a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12" (UID: "a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12"). InnerVolumeSpecName "kube-api-access-lbrz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:41:52 crc kubenswrapper[4740]: I1009 10:41:52.241450 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12" (UID: "a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:41:52 crc kubenswrapper[4740]: I1009 10:41:52.297458 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbrz6\" (UniqueName: \"kubernetes.io/projected/a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12-kube-api-access-lbrz6\") on node \"crc\" DevicePath \"\"" Oct 09 10:41:52 crc kubenswrapper[4740]: I1009 10:41:52.297486 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 10:41:52 crc kubenswrapper[4740]: I1009 10:41:52.297495 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 10:41:52 crc kubenswrapper[4740]: I1009 10:41:52.695796 4740 generic.go:334] "Generic (PLEG): container finished" podID="a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12" containerID="d281fa0da01f76f42a12a713ec8e0ce151a4a01493b66909f866aaafb982dc0f" exitCode=0 Oct 09 10:41:52 crc kubenswrapper[4740]: I1009 10:41:52.695845 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hcpc4" event={"ID":"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12","Type":"ContainerDied","Data":"d281fa0da01f76f42a12a713ec8e0ce151a4a01493b66909f866aaafb982dc0f"} Oct 09 10:41:52 crc kubenswrapper[4740]: I1009 10:41:52.695876 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hcpc4" event={"ID":"a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12","Type":"ContainerDied","Data":"3784042afe1ca55689e41338f66d14c93f4f8913f62da32f80695c77ccab39db"} Oct 09 10:41:52 crc kubenswrapper[4740]: I1009 10:41:52.695897 4740 scope.go:117] "RemoveContainer" containerID="d281fa0da01f76f42a12a713ec8e0ce151a4a01493b66909f866aaafb982dc0f" Oct 09 10:41:52 crc kubenswrapper[4740]: I1009 
10:41:52.695918 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hcpc4" Oct 09 10:41:52 crc kubenswrapper[4740]: I1009 10:41:52.738203 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hcpc4"] Oct 09 10:41:52 crc kubenswrapper[4740]: I1009 10:41:52.743862 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hcpc4"] Oct 09 10:41:52 crc kubenswrapper[4740]: I1009 10:41:52.852054 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wdj4w" Oct 09 10:41:52 crc kubenswrapper[4740]: I1009 10:41:52.852099 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wdj4w" Oct 09 10:41:52 crc kubenswrapper[4740]: I1009 10:41:52.895540 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wdj4w" Oct 09 10:41:53 crc kubenswrapper[4740]: I1009 10:41:53.767702 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12" path="/var/lib/kubelet/pods/a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12/volumes" Oct 09 10:41:53 crc kubenswrapper[4740]: I1009 10:41:53.770932 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wdj4w" Oct 09 10:41:54 crc kubenswrapper[4740]: I1009 10:41:54.665958 4740 scope.go:117] "RemoveContainer" containerID="4163e7c038eaab0d0c278b837c1b4d65979d3c82feb3183ad03c85640471da73" Oct 09 10:41:54 crc kubenswrapper[4740]: I1009 10:41:54.806646 4740 scope.go:117] "RemoveContainer" containerID="c564df5f77b8101caa2b2bc874d75676d43f92d01a0c10eb09dcee87d325eaea" Oct 09 10:41:54 crc kubenswrapper[4740]: I1009 10:41:54.842287 4740 scope.go:117] "RemoveContainer" 
containerID="d281fa0da01f76f42a12a713ec8e0ce151a4a01493b66909f866aaafb982dc0f" Oct 09 10:41:54 crc kubenswrapper[4740]: E1009 10:41:54.844253 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d281fa0da01f76f42a12a713ec8e0ce151a4a01493b66909f866aaafb982dc0f\": container with ID starting with d281fa0da01f76f42a12a713ec8e0ce151a4a01493b66909f866aaafb982dc0f not found: ID does not exist" containerID="d281fa0da01f76f42a12a713ec8e0ce151a4a01493b66909f866aaafb982dc0f" Oct 09 10:41:54 crc kubenswrapper[4740]: I1009 10:41:54.844285 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d281fa0da01f76f42a12a713ec8e0ce151a4a01493b66909f866aaafb982dc0f"} err="failed to get container status \"d281fa0da01f76f42a12a713ec8e0ce151a4a01493b66909f866aaafb982dc0f\": rpc error: code = NotFound desc = could not find container \"d281fa0da01f76f42a12a713ec8e0ce151a4a01493b66909f866aaafb982dc0f\": container with ID starting with d281fa0da01f76f42a12a713ec8e0ce151a4a01493b66909f866aaafb982dc0f not found: ID does not exist" Oct 09 10:41:54 crc kubenswrapper[4740]: I1009 10:41:54.844305 4740 scope.go:117] "RemoveContainer" containerID="4163e7c038eaab0d0c278b837c1b4d65979d3c82feb3183ad03c85640471da73" Oct 09 10:41:54 crc kubenswrapper[4740]: E1009 10:41:54.844978 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4163e7c038eaab0d0c278b837c1b4d65979d3c82feb3183ad03c85640471da73\": container with ID starting with 4163e7c038eaab0d0c278b837c1b4d65979d3c82feb3183ad03c85640471da73 not found: ID does not exist" containerID="4163e7c038eaab0d0c278b837c1b4d65979d3c82feb3183ad03c85640471da73" Oct 09 10:41:54 crc kubenswrapper[4740]: I1009 10:41:54.845087 4740 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4163e7c038eaab0d0c278b837c1b4d65979d3c82feb3183ad03c85640471da73"} err="failed to get container status \"4163e7c038eaab0d0c278b837c1b4d65979d3c82feb3183ad03c85640471da73\": rpc error: code = NotFound desc = could not find container \"4163e7c038eaab0d0c278b837c1b4d65979d3c82feb3183ad03c85640471da73\": container with ID starting with 4163e7c038eaab0d0c278b837c1b4d65979d3c82feb3183ad03c85640471da73 not found: ID does not exist" Oct 09 10:41:54 crc kubenswrapper[4740]: I1009 10:41:54.845122 4740 scope.go:117] "RemoveContainer" containerID="c564df5f77b8101caa2b2bc874d75676d43f92d01a0c10eb09dcee87d325eaea" Oct 09 10:41:54 crc kubenswrapper[4740]: E1009 10:41:54.845543 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c564df5f77b8101caa2b2bc874d75676d43f92d01a0c10eb09dcee87d325eaea\": container with ID starting with c564df5f77b8101caa2b2bc874d75676d43f92d01a0c10eb09dcee87d325eaea not found: ID does not exist" containerID="c564df5f77b8101caa2b2bc874d75676d43f92d01a0c10eb09dcee87d325eaea" Oct 09 10:41:54 crc kubenswrapper[4740]: I1009 10:41:54.845580 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c564df5f77b8101caa2b2bc874d75676d43f92d01a0c10eb09dcee87d325eaea"} err="failed to get container status \"c564df5f77b8101caa2b2bc874d75676d43f92d01a0c10eb09dcee87d325eaea\": rpc error: code = NotFound desc = could not find container \"c564df5f77b8101caa2b2bc874d75676d43f92d01a0c10eb09dcee87d325eaea\": container with ID starting with c564df5f77b8101caa2b2bc874d75676d43f92d01a0c10eb09dcee87d325eaea not found: ID does not exist" Oct 09 10:41:55 crc kubenswrapper[4740]: I1009 10:41:55.727521 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6c6ccc6df6-wxp9l" 
event={"ID":"e0e26b11-7270-46ec-9042-0eaab1e2a459","Type":"ContainerStarted","Data":"2de8170168aa3d06d7d6557ff3d0192492e9a59d17c41134e51abdeb4fbe113d"}
Oct 09 10:41:56 crc kubenswrapper[4740]: I1009 10:41:56.520434 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wdj4w"]
Oct 09 10:41:56 crc kubenswrapper[4740]: I1009 10:41:56.523197 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wdj4w" podUID="0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab" containerName="registry-server" containerID="cri-o://5f2690fbf5684757d40a566cf2d4dcf5e5292d5e0f1f5f697d1226cffe961e5d" gracePeriod=2
Oct 09 10:41:56 crc kubenswrapper[4740]: I1009 10:41:56.783410 4740 generic.go:334] "Generic (PLEG): container finished" podID="0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab" containerID="5f2690fbf5684757d40a566cf2d4dcf5e5292d5e0f1f5f697d1226cffe961e5d" exitCode=0
Oct 09 10:41:56 crc kubenswrapper[4740]: I1009 10:41:56.783563 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdj4w" event={"ID":"0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab","Type":"ContainerDied","Data":"5f2690fbf5684757d40a566cf2d4dcf5e5292d5e0f1f5f697d1226cffe961e5d"}
Oct 09 10:41:57 crc kubenswrapper[4740]: I1009 10:41:57.461264 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wdj4w"
Oct 09 10:41:57 crc kubenswrapper[4740]: I1009 10:41:57.583319 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab-catalog-content\") pod \"0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab\" (UID: \"0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab\") "
Oct 09 10:41:57 crc kubenswrapper[4740]: I1009 10:41:57.584076 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2ccw\" (UniqueName: \"kubernetes.io/projected/0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab-kube-api-access-w2ccw\") pod \"0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab\" (UID: \"0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab\") "
Oct 09 10:41:57 crc kubenswrapper[4740]: I1009 10:41:57.584136 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab-utilities\") pod \"0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab\" (UID: \"0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab\") "
Oct 09 10:41:57 crc kubenswrapper[4740]: I1009 10:41:57.585935 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab-utilities" (OuterVolumeSpecName: "utilities") pod "0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab" (UID: "0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 10:41:57 crc kubenswrapper[4740]: I1009 10:41:57.592019 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab-kube-api-access-w2ccw" (OuterVolumeSpecName: "kube-api-access-w2ccw") pod "0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab" (UID: "0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab"). InnerVolumeSpecName "kube-api-access-w2ccw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:41:57 crc kubenswrapper[4740]: I1009 10:41:57.634930 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab" (UID: "0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 10:41:57 crc kubenswrapper[4740]: I1009 10:41:57.685517 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab-utilities\") on node \"crc\" DevicePath \"\""
Oct 09 10:41:57 crc kubenswrapper[4740]: I1009 10:41:57.685563 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 09 10:41:57 crc kubenswrapper[4740]: I1009 10:41:57.685582 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2ccw\" (UniqueName: \"kubernetes.io/projected/0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab-kube-api-access-w2ccw\") on node \"crc\" DevicePath \"\""
Oct 09 10:41:57 crc kubenswrapper[4740]: I1009 10:41:57.805919 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6c6ccc6df6-wxp9l" event={"ID":"e0e26b11-7270-46ec-9042-0eaab1e2a459","Type":"ContainerStarted","Data":"116f8ce6697300b114f1a7853bfc1725681bc10200922bba02bb4eeb4a6c8c91"}
Oct 09 10:41:57 crc kubenswrapper[4740]: I1009 10:41:57.806532 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6c6ccc6df6-wxp9l"
Oct 09 10:41:57 crc kubenswrapper[4740]: I1009 10:41:57.809656 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdj4w" event={"ID":"0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab","Type":"ContainerDied","Data":"f34d003c422d1b9a7a7996d0dac4b540effb30e3ac43b88acdf5d560e8a7d6e8"}
Oct 09 10:41:57 crc kubenswrapper[4740]: I1009 10:41:57.809695 4740 scope.go:117] "RemoveContainer" containerID="5f2690fbf5684757d40a566cf2d4dcf5e5292d5e0f1f5f697d1226cffe961e5d"
Oct 09 10:41:57 crc kubenswrapper[4740]: I1009 10:41:57.809833 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wdj4w"
Oct 09 10:41:57 crc kubenswrapper[4740]: I1009 10:41:57.833884 4740 scope.go:117] "RemoveContainer" containerID="e78828d7a2b23eae54b4feb54c2cfcfd23d5c007e1fd384e7b89ed95e525279a"
Oct 09 10:41:57 crc kubenswrapper[4740]: I1009 10:41:57.860372 4740 scope.go:117] "RemoveContainer" containerID="f3904b9e894d320797b29f98121b44b9c246b90cca868429090f59d9bad35e40"
Oct 09 10:41:57 crc kubenswrapper[4740]: I1009 10:41:57.867806 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6c6ccc6df6-wxp9l" podStartSLOduration=1.739950689 podStartE2EDuration="7.86779209s" podCreationTimestamp="2025-10-09 10:41:50 +0000 UTC" firstStartedPulling="2025-10-09 10:41:51.1159595 +0000 UTC m=+850.078159881" lastFinishedPulling="2025-10-09 10:41:57.243800901 +0000 UTC m=+856.206001282" observedRunningTime="2025-10-09 10:41:57.865187207 +0000 UTC m=+856.827387588" watchObservedRunningTime="2025-10-09 10:41:57.86779209 +0000 UTC m=+856.829992471"
Oct 09 10:41:57 crc kubenswrapper[4740]: I1009 10:41:57.885458 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wdj4w"]
Oct 09 10:41:57 crc kubenswrapper[4740]: I1009 10:41:57.889161 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wdj4w"]
Oct 09 10:41:59 crc kubenswrapper[4740]: I1009 10:41:59.766450 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab" path="/var/lib/kubelet/pods/0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab/volumes"
Oct 09 10:42:00 crc kubenswrapper[4740]: I1009 10:42:00.673338 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6c6ccc6df6-wxp9l"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.753713 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jqnnv"]
Oct 09 10:42:18 crc kubenswrapper[4740]: E1009 10:42:18.754723 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab" containerName="registry-server"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.754745 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab" containerName="registry-server"
Oct 09 10:42:18 crc kubenswrapper[4740]: E1009 10:42:18.754846 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab" containerName="extract-content"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.754859 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab" containerName="extract-content"
Oct 09 10:42:18 crc kubenswrapper[4740]: E1009 10:42:18.754875 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab" containerName="extract-utilities"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.754888 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab" containerName="extract-utilities"
Oct 09 10:42:18 crc kubenswrapper[4740]: E1009 10:42:18.754906 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12" containerName="extract-utilities"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.754917 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12" containerName="extract-utilities"
Oct 09 10:42:18 crc kubenswrapper[4740]: E1009 10:42:18.754937 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12" containerName="extract-content"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.754949 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12" containerName="extract-content"
Oct 09 10:42:18 crc kubenswrapper[4740]: E1009 10:42:18.754969 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12" containerName="registry-server"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.754980 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12" containerName="registry-server"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.755177 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3fa39f1-d9c3-4c58-acea-0dbb7abfbb12" containerName="registry-server"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.755199 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bf1fbb5-80bc-4b41-86bc-7fa73c2d30ab" containerName="registry-server"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.756171 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jqnnv"
Oct 09 10:42:18 crc kubenswrapper[4740]: W1009 10:42:18.757875 4740 reflector.go:561] object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-9pwf2": failed to list *v1.Secret: secrets "barbican-operator-controller-manager-dockercfg-9pwf2" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object
Oct 09 10:42:18 crc kubenswrapper[4740]: E1009 10:42:18.757927 4740 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"barbican-operator-controller-manager-dockercfg-9pwf2\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"barbican-operator-controller-manager-dockercfg-9pwf2\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.758285 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-mp2p4"]
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.759610 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-mp2p4"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.760882 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-fsf94"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.773835 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-mp2p4"]
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.778503 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jqnnv"]
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.784876 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-w2ftw"]
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.785956 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-w2ftw"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.787659 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-v4ld6"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.794184 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h2hp\" (UniqueName: \"kubernetes.io/projected/95b86671-972c-4a57-b68b-0421b82bd3d4-kube-api-access-2h2hp\") pod \"cinder-operator-controller-manager-59cdc64769-mp2p4\" (UID: \"95b86671-972c-4a57-b68b-0421b82bd3d4\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-mp2p4"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.794228 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n65l4\" (UniqueName: \"kubernetes.io/projected/93f4faa8-4d5e-48d9-ac5a-bb1468f972d3-kube-api-access-n65l4\") pod \"barbican-operator-controller-manager-64f84fcdbb-jqnnv\" (UID: \"93f4faa8-4d5e-48d9-ac5a-bb1468f972d3\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jqnnv"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.794351 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xpcd\" (UniqueName: \"kubernetes.io/projected/2b3fb0c8-988f-4ed4-86e0-77db8e5e06a8-kube-api-access-9xpcd\") pod \"designate-operator-controller-manager-687df44cdb-w2ftw\" (UID: \"2b3fb0c8-988f-4ed4-86e0-77db8e5e06a8\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-w2ftw"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.800242 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-w2ftw"]
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.803688 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-sf7cf"]
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.805185 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sf7cf"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.806843 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-2shg4"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.813624 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-268g9"]
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.814520 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-268g9"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.822057 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-qwrfz"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.839933 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-268g9"]
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.848098 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-p4btw"]
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.849343 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-p4btw"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.852838 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-mc2lf"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.854273 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-sf7cf"]
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.862452 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-4zsvx"]
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.866041 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-4zsvx"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.868386 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-prf5f"]
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.869604 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-prf5f"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.873858 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.874055 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-wcztv"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.878285 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-p4btw"]
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.878775 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-8zrw8"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.894818 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-4zsvx"]
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.894892 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xpcd\" (UniqueName: \"kubernetes.io/projected/2b3fb0c8-988f-4ed4-86e0-77db8e5e06a8-kube-api-access-9xpcd\") pod \"designate-operator-controller-manager-687df44cdb-w2ftw\" (UID: \"2b3fb0c8-988f-4ed4-86e0-77db8e5e06a8\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-w2ftw"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.896464 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h2hp\" (UniqueName: \"kubernetes.io/projected/95b86671-972c-4a57-b68b-0421b82bd3d4-kube-api-access-2h2hp\") pod \"cinder-operator-controller-manager-59cdc64769-mp2p4\" (UID: \"95b86671-972c-4a57-b68b-0421b82bd3d4\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-mp2p4"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.896548 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n65l4\" (UniqueName: \"kubernetes.io/projected/93f4faa8-4d5e-48d9-ac5a-bb1468f972d3-kube-api-access-n65l4\") pod \"barbican-operator-controller-manager-64f84fcdbb-jqnnv\" (UID: \"93f4faa8-4d5e-48d9-ac5a-bb1468f972d3\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jqnnv"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.904027 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-prf5f"]
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.921179 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-hsl94"]
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.922874 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hsl94"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.923957 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xpcd\" (UniqueName: \"kubernetes.io/projected/2b3fb0c8-988f-4ed4-86e0-77db8e5e06a8-kube-api-access-9xpcd\") pod \"designate-operator-controller-manager-687df44cdb-w2ftw\" (UID: \"2b3fb0c8-988f-4ed4-86e0-77db8e5e06a8\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-w2ftw"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.931445 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-hsl94"]
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.935266 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-rmm5b"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.936896 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-6jtst"]
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.943598 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n65l4\" (UniqueName: \"kubernetes.io/projected/93f4faa8-4d5e-48d9-ac5a-bb1468f972d3-kube-api-access-n65l4\") pod \"barbican-operator-controller-manager-64f84fcdbb-jqnnv\" (UID: \"93f4faa8-4d5e-48d9-ac5a-bb1468f972d3\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jqnnv"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.947118 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h2hp\" (UniqueName: \"kubernetes.io/projected/95b86671-972c-4a57-b68b-0421b82bd3d4-kube-api-access-2h2hp\") pod \"cinder-operator-controller-manager-59cdc64769-mp2p4\" (UID: \"95b86671-972c-4a57-b68b-0421b82bd3d4\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-mp2p4"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.947174 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-6jtst"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.950684 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-6jtst"]
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.953981 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-cz2dz"]
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.955557 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-gfq7l"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.957542 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-cz2dz"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.963633 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-h4lw2"]
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.964811 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-h4lw2"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.967293 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-924lt"
Oct 09 10:42:18 crc kubenswrapper[4740]: I1009 10:42:18.967650 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-p4q9h"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.012999 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jtpp\" (UniqueName: \"kubernetes.io/projected/27b8cb71-5bd2-4133-bf5a-db571521861b-kube-api-access-7jtpp\") pod \"infra-operator-controller-manager-585fc5b659-4zsvx\" (UID: \"27b8cb71-5bd2-4133-bf5a-db571521861b\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-4zsvx"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.013046 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4p2x\" (UniqueName: \"kubernetes.io/projected/1519e3af-34c9-4722-9aaa-8a10ef0d49de-kube-api-access-t4p2x\") pod \"heat-operator-controller-manager-6d9967f8dd-sf7cf\" (UID: \"1519e3af-34c9-4722-9aaa-8a10ef0d49de\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sf7cf"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.013069 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27b8cb71-5bd2-4133-bf5a-db571521861b-cert\") pod \"infra-operator-controller-manager-585fc5b659-4zsvx\" (UID: \"27b8cb71-5bd2-4133-bf5a-db571521861b\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-4zsvx"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.013129 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmjbd\" (UniqueName: \"kubernetes.io/projected/8ae60958-f755-47fd-891b-74356bff787c-kube-api-access-hmjbd\") pod \"ironic-operator-controller-manager-74cb5cbc49-prf5f\" (UID: \"8ae60958-f755-47fd-891b-74356bff787c\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-prf5f"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.013149 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg2cx\" (UniqueName: \"kubernetes.io/projected/a3c08e43-cc8b-433e-ba8e-fd225eef09ed-kube-api-access-cg2cx\") pod \"glance-operator-controller-manager-7bb46cd7d-268g9\" (UID: \"a3c08e43-cc8b-433e-ba8e-fd225eef09ed\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-268g9"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.013173 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lqs6\" (UniqueName: \"kubernetes.io/projected/5348e551-de55-4c32-af1e-ac9facc061d9-kube-api-access-9lqs6\") pod \"horizon-operator-controller-manager-6d74794d9b-p4btw\" (UID: \"5348e551-de55-4c32-af1e-ac9facc061d9\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-p4btw"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.014174 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-cz2dz"]
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.030062 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-h4lw2"]
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.079351 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-kv5jg"]
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.081564 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-kv5jg"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.085547 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-c84k8"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.093894 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-8nkmb"]
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.095128 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-8nkmb"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.100063 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-cbfws"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.100540 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-mp2p4"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.121417 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg2cx\" (UniqueName: \"kubernetes.io/projected/a3c08e43-cc8b-433e-ba8e-fd225eef09ed-kube-api-access-cg2cx\") pod \"glance-operator-controller-manager-7bb46cd7d-268g9\" (UID: \"a3c08e43-cc8b-433e-ba8e-fd225eef09ed\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-268g9"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.121486 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsfxv\" (UniqueName: \"kubernetes.io/projected/8313eb28-2711-404c-817c-b782ea1cf41a-kube-api-access-bsfxv\") pod \"mariadb-operator-controller-manager-5777b4f897-cz2dz\" (UID: \"8313eb28-2711-404c-817c-b782ea1cf41a\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-cz2dz"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.121516 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lqs6\" (UniqueName: \"kubernetes.io/projected/5348e551-de55-4c32-af1e-ac9facc061d9-kube-api-access-9lqs6\") pod \"horizon-operator-controller-manager-6d74794d9b-p4btw\" (UID: \"5348e551-de55-4c32-af1e-ac9facc061d9\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-p4btw"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.121544 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48jjf\" (UniqueName: \"kubernetes.io/projected/f3e10f4d-eabe-4818-b34f-96dd2ba4d4a1-kube-api-access-48jjf\") pod \"manila-operator-controller-manager-59578bc799-6jtst\" (UID: \"f3e10f4d-eabe-4818-b34f-96dd2ba4d4a1\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-6jtst"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.121580 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jtpp\" (UniqueName: \"kubernetes.io/projected/27b8cb71-5bd2-4133-bf5a-db571521861b-kube-api-access-7jtpp\") pod \"infra-operator-controller-manager-585fc5b659-4zsvx\" (UID: \"27b8cb71-5bd2-4133-bf5a-db571521861b\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-4zsvx"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.121614 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4p2x\" (UniqueName: \"kubernetes.io/projected/1519e3af-34c9-4722-9aaa-8a10ef0d49de-kube-api-access-t4p2x\") pod \"heat-operator-controller-manager-6d9967f8dd-sf7cf\" (UID: \"1519e3af-34c9-4722-9aaa-8a10ef0d49de\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sf7cf"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.121643 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27b8cb71-5bd2-4133-bf5a-db571521861b-cert\") pod \"infra-operator-controller-manager-585fc5b659-4zsvx\" (UID: \"27b8cb71-5bd2-4133-bf5a-db571521861b\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-4zsvx"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.121678 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfq9q\" (UniqueName: \"kubernetes.io/projected/254d742d-881a-4ea9-97fd-2246d7109a77-kube-api-access-lfq9q\") pod \"neutron-operator-controller-manager-797d478b46-h4lw2\" (UID: \"254d742d-881a-4ea9-97fd-2246d7109a77\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-h4lw2"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.121704 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkgq2\" (UniqueName: \"kubernetes.io/projected/67fd364b-d05e-4d57-a817-3f64be5cdba0-kube-api-access-nkgq2\") pod \"keystone-operator-controller-manager-ddb98f99b-hsl94\" (UID: \"67fd364b-d05e-4d57-a817-3f64be5cdba0\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hsl94"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.122325 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmjbd\" (UniqueName: \"kubernetes.io/projected/8ae60958-f755-47fd-891b-74356bff787c-kube-api-access-hmjbd\") pod \"ironic-operator-controller-manager-74cb5cbc49-prf5f\" (UID: \"8ae60958-f755-47fd-891b-74356bff787c\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-prf5f"
Oct 09 10:42:19 crc kubenswrapper[4740]: E1009 10:42:19.122644 4740 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Oct 09 10:42:19 crc kubenswrapper[4740]: E1009 10:42:19.122700 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27b8cb71-5bd2-4133-bf5a-db571521861b-cert podName:27b8cb71-5bd2-4133-bf5a-db571521861b nodeName:}" failed. No retries permitted until 2025-10-09 10:42:19.622678272 +0000 UTC m=+878.584878653 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/27b8cb71-5bd2-4133-bf5a-db571521861b-cert") pod "infra-operator-controller-manager-585fc5b659-4zsvx" (UID: "27b8cb71-5bd2-4133-bf5a-db571521861b") : secret "infra-operator-webhook-server-cert" not found
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.130074 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-w2ftw"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.130666 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-kv5jg"]
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.144908 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx"]
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.145831 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.149013 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-5p84b"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.149186 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.171138 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4p2x\" (UniqueName: \"kubernetes.io/projected/1519e3af-34c9-4722-9aaa-8a10ef0d49de-kube-api-access-t4p2x\") pod \"heat-operator-controller-manager-6d9967f8dd-sf7cf\" (UID: \"1519e3af-34c9-4722-9aaa-8a10ef0d49de\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sf7cf"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.171211 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-8nkmb"]
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.177082 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmjbd\" (UniqueName: \"kubernetes.io/projected/8ae60958-f755-47fd-891b-74356bff787c-kube-api-access-hmjbd\") pod \"ironic-operator-controller-manager-74cb5cbc49-prf5f\" (UID: \"8ae60958-f755-47fd-891b-74356bff787c\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-prf5f"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.183325 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg2cx\" (UniqueName: \"kubernetes.io/projected/a3c08e43-cc8b-433e-ba8e-fd225eef09ed-kube-api-access-cg2cx\") pod \"glance-operator-controller-manager-7bb46cd7d-268g9\" (UID: \"a3c08e43-cc8b-433e-ba8e-fd225eef09ed\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-268g9"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.201689 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jtpp\" (UniqueName: \"kubernetes.io/projected/27b8cb71-5bd2-4133-bf5a-db571521861b-kube-api-access-7jtpp\") pod \"infra-operator-controller-manager-585fc5b659-4zsvx\" (UID: \"27b8cb71-5bd2-4133-bf5a-db571521861b\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-4zsvx"
Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.222092 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-prf5f" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.222991 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3179f3c7-2f14-494b-9fea-3c217a11af2b-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx\" (UID: \"3179f3c7-2f14-494b-9fea-3c217a11af2b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.223031 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpwk5\" (UniqueName: \"kubernetes.io/projected/3179f3c7-2f14-494b-9fea-3c217a11af2b-kube-api-access-zpwk5\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx\" (UID: \"3179f3c7-2f14-494b-9fea-3c217a11af2b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.223086 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29csl\" (UniqueName: \"kubernetes.io/projected/a9f75d3c-e107-48aa-b15b-442b785b8945-kube-api-access-29csl\") pod \"octavia-operator-controller-manager-6d7c7ddf95-8nkmb\" (UID: \"a9f75d3c-e107-48aa-b15b-442b785b8945\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-8nkmb" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.223113 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfq9q\" (UniqueName: \"kubernetes.io/projected/254d742d-881a-4ea9-97fd-2246d7109a77-kube-api-access-lfq9q\") pod \"neutron-operator-controller-manager-797d478b46-h4lw2\" (UID: \"254d742d-881a-4ea9-97fd-2246d7109a77\") " 
pod="openstack-operators/neutron-operator-controller-manager-797d478b46-h4lw2" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.223141 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkgq2\" (UniqueName: \"kubernetes.io/projected/67fd364b-d05e-4d57-a817-3f64be5cdba0-kube-api-access-nkgq2\") pod \"keystone-operator-controller-manager-ddb98f99b-hsl94\" (UID: \"67fd364b-d05e-4d57-a817-3f64be5cdba0\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hsl94" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.223187 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvqg9\" (UniqueName: \"kubernetes.io/projected/676e4e26-21ec-4b2c-ab3f-bc593cddfb33-kube-api-access-rvqg9\") pod \"nova-operator-controller-manager-57bb74c7bf-kv5jg\" (UID: \"676e4e26-21ec-4b2c-ab3f-bc593cddfb33\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-kv5jg" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.223229 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsfxv\" (UniqueName: \"kubernetes.io/projected/8313eb28-2711-404c-817c-b782ea1cf41a-kube-api-access-bsfxv\") pod \"mariadb-operator-controller-manager-5777b4f897-cz2dz\" (UID: \"8313eb28-2711-404c-817c-b782ea1cf41a\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-cz2dz" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.223269 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48jjf\" (UniqueName: \"kubernetes.io/projected/f3e10f4d-eabe-4818-b34f-96dd2ba4d4a1-kube-api-access-48jjf\") pod \"manila-operator-controller-manager-59578bc799-6jtst\" (UID: \"f3e10f4d-eabe-4818-b34f-96dd2ba4d4a1\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-6jtst" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.228928 
4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx"] Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.232641 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lqs6\" (UniqueName: \"kubernetes.io/projected/5348e551-de55-4c32-af1e-ac9facc061d9-kube-api-access-9lqs6\") pod \"horizon-operator-controller-manager-6d74794d9b-p4btw\" (UID: \"5348e551-de55-4c32-af1e-ac9facc061d9\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-p4btw" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.254192 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f96f8c84-r494n"] Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.255134 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-r494n" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.257914 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-nlcn4" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.260877 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkgq2\" (UniqueName: \"kubernetes.io/projected/67fd364b-d05e-4d57-a817-3f64be5cdba0-kube-api-access-nkgq2\") pod \"keystone-operator-controller-manager-ddb98f99b-hsl94\" (UID: \"67fd364b-d05e-4d57-a817-3f64be5cdba0\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hsl94" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.264921 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48jjf\" (UniqueName: \"kubernetes.io/projected/f3e10f4d-eabe-4818-b34f-96dd2ba4d4a1-kube-api-access-48jjf\") pod \"manila-operator-controller-manager-59578bc799-6jtst\" (UID: 
\"f3e10f4d-eabe-4818-b34f-96dd2ba4d4a1\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-6jtst" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.272588 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsfxv\" (UniqueName: \"kubernetes.io/projected/8313eb28-2711-404c-817c-b782ea1cf41a-kube-api-access-bsfxv\") pod \"mariadb-operator-controller-manager-5777b4f897-cz2dz\" (UID: \"8313eb28-2711-404c-817c-b782ea1cf41a\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-cz2dz" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.280861 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfq9q\" (UniqueName: \"kubernetes.io/projected/254d742d-881a-4ea9-97fd-2246d7109a77-kube-api-access-lfq9q\") pod \"neutron-operator-controller-manager-797d478b46-h4lw2\" (UID: \"254d742d-881a-4ea9-97fd-2246d7109a77\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-h4lw2" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.304946 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-q5tlk"] Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.305850 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-q5tlk" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.314183 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-2wtgr" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.323962 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvqg9\" (UniqueName: \"kubernetes.io/projected/676e4e26-21ec-4b2c-ab3f-bc593cddfb33-kube-api-access-rvqg9\") pod \"nova-operator-controller-manager-57bb74c7bf-kv5jg\" (UID: \"676e4e26-21ec-4b2c-ab3f-bc593cddfb33\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-kv5jg" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.324037 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3179f3c7-2f14-494b-9fea-3c217a11af2b-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx\" (UID: \"3179f3c7-2f14-494b-9fea-3c217a11af2b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.324058 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpwk5\" (UniqueName: \"kubernetes.io/projected/3179f3c7-2f14-494b-9fea-3c217a11af2b-kube-api-access-zpwk5\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx\" (UID: \"3179f3c7-2f14-494b-9fea-3c217a11af2b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.324097 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29csl\" (UniqueName: \"kubernetes.io/projected/a9f75d3c-e107-48aa-b15b-442b785b8945-kube-api-access-29csl\") pod 
\"octavia-operator-controller-manager-6d7c7ddf95-8nkmb\" (UID: \"a9f75d3c-e107-48aa-b15b-442b785b8945\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-8nkmb" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.324132 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pktrx\" (UniqueName: \"kubernetes.io/projected/7e7f599f-1cc9-41fc-b683-8b0de6e48761-kube-api-access-pktrx\") pod \"ovn-operator-controller-manager-6f96f8c84-r494n\" (UID: \"7e7f599f-1cc9-41fc-b683-8b0de6e48761\") " pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-r494n" Oct 09 10:42:19 crc kubenswrapper[4740]: E1009 10:42:19.324652 4740 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 09 10:42:19 crc kubenswrapper[4740]: E1009 10:42:19.324691 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3179f3c7-2f14-494b-9fea-3c217a11af2b-cert podName:3179f3c7-2f14-494b-9fea-3c217a11af2b nodeName:}" failed. No retries permitted until 2025-10-09 10:42:19.824677432 +0000 UTC m=+878.786877813 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3179f3c7-2f14-494b-9fea-3c217a11af2b-cert") pod "openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx" (UID: "3179f3c7-2f14-494b-9fea-3c217a11af2b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.334600 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-gg5c4"] Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.335572 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-gg5c4" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.336658 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hsl94" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.339716 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f96f8c84-r494n"] Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.367105 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-6jtst" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.371287 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-gg5c4"] Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.371314 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-9lpgf" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.396737 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvqg9\" (UniqueName: \"kubernetes.io/projected/676e4e26-21ec-4b2c-ab3f-bc593cddfb33-kube-api-access-rvqg9\") pod \"nova-operator-controller-manager-57bb74c7bf-kv5jg\" (UID: \"676e4e26-21ec-4b2c-ab3f-bc593cddfb33\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-kv5jg" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.397905 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29csl\" (UniqueName: \"kubernetes.io/projected/a9f75d3c-e107-48aa-b15b-442b785b8945-kube-api-access-29csl\") pod \"octavia-operator-controller-manager-6d7c7ddf95-8nkmb\" (UID: \"a9f75d3c-e107-48aa-b15b-442b785b8945\") " 
pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-8nkmb" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.402877 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpwk5\" (UniqueName: \"kubernetes.io/projected/3179f3c7-2f14-494b-9fea-3c217a11af2b-kube-api-access-zpwk5\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx\" (UID: \"3179f3c7-2f14-494b-9fea-3c217a11af2b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.406496 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-q5tlk"] Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.406848 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-cz2dz" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.426036 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkgx2\" (UniqueName: \"kubernetes.io/projected/1a6180f0-55bd-4c7e-a96c-97762cace534-kube-api-access-kkgx2\") pod \"swift-operator-controller-manager-5f4d5dfdc6-gg5c4\" (UID: \"1a6180f0-55bd-4c7e-a96c-97762cace534\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-gg5c4" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.426087 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pktrx\" (UniqueName: \"kubernetes.io/projected/7e7f599f-1cc9-41fc-b683-8b0de6e48761-kube-api-access-pktrx\") pod \"ovn-operator-controller-manager-6f96f8c84-r494n\" (UID: \"7e7f599f-1cc9-41fc-b683-8b0de6e48761\") " pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-r494n" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.426186 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ggls\" (UniqueName: \"kubernetes.io/projected/f1909d9f-c6e3-4c55-93f5-be679e3c3792-kube-api-access-8ggls\") pod \"placement-operator-controller-manager-664664cb68-q5tlk\" (UID: \"f1909d9f-c6e3-4c55-93f5-be679e3c3792\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-q5tlk" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.440258 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sf7cf" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.450046 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-h4lw2" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.451822 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pktrx\" (UniqueName: \"kubernetes.io/projected/7e7f599f-1cc9-41fc-b683-8b0de6e48761-kube-api-access-pktrx\") pod \"ovn-operator-controller-manager-6f96f8c84-r494n\" (UID: \"7e7f599f-1cc9-41fc-b683-8b0de6e48761\") " pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-r494n" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.454258 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-268g9" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.455007 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-578874c84d-6fng6"] Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.456627 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-6fng6" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.462302 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-p67n7" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.479612 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-578874c84d-6fng6"] Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.490202 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-p4btw" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.504781 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-mxz8d"] Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.506296 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mxz8d" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.508708 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-fzr5w" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.526975 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ggls\" (UniqueName: \"kubernetes.io/projected/f1909d9f-c6e3-4c55-93f5-be679e3c3792-kube-api-access-8ggls\") pod \"placement-operator-controller-manager-664664cb68-q5tlk\" (UID: \"f1909d9f-c6e3-4c55-93f5-be679e3c3792\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-q5tlk" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.527085 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkgx2\" (UniqueName: \"kubernetes.io/projected/1a6180f0-55bd-4c7e-a96c-97762cace534-kube-api-access-kkgx2\") pod \"swift-operator-controller-manager-5f4d5dfdc6-gg5c4\" (UID: \"1a6180f0-55bd-4c7e-a96c-97762cace534\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-gg5c4" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.527125 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44zzr\" (UniqueName: \"kubernetes.io/projected/594a8f26-8acc-44a8-b024-665012e570f6-kube-api-access-44zzr\") pod \"telemetry-operator-controller-manager-578874c84d-6fng6\" (UID: \"594a8f26-8acc-44a8-b024-665012e570f6\") " pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-6fng6" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.549376 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-mxz8d"] Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.550528 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kkgx2\" (UniqueName: \"kubernetes.io/projected/1a6180f0-55bd-4c7e-a96c-97762cace534-kube-api-access-kkgx2\") pod \"swift-operator-controller-manager-5f4d5dfdc6-gg5c4\" (UID: \"1a6180f0-55bd-4c7e-a96c-97762cace534\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-gg5c4" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.551588 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ggls\" (UniqueName: \"kubernetes.io/projected/f1909d9f-c6e3-4c55-93f5-be679e3c3792-kube-api-access-8ggls\") pod \"placement-operator-controller-manager-664664cb68-q5tlk\" (UID: \"f1909d9f-c6e3-4c55-93f5-be679e3c3792\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-q5tlk" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.574835 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-kv5jg" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.575820 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-646675d848-ngw9n"] Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.583306 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-646675d848-ngw9n" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.588151 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-6xd7z" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.592433 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-646675d848-ngw9n"] Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.602867 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-8nkmb" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.629093 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27b8cb71-5bd2-4133-bf5a-db571521861b-cert\") pod \"infra-operator-controller-manager-585fc5b659-4zsvx\" (UID: \"27b8cb71-5bd2-4133-bf5a-db571521861b\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-4zsvx" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.629239 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44zzr\" (UniqueName: \"kubernetes.io/projected/594a8f26-8acc-44a8-b024-665012e570f6-kube-api-access-44zzr\") pod \"telemetry-operator-controller-manager-578874c84d-6fng6\" (UID: \"594a8f26-8acc-44a8-b024-665012e570f6\") " pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-6fng6" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.629342 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8sf9\" (UniqueName: \"kubernetes.io/projected/d454e0e1-1745-4fc0-aea1-9d231de7fa65-kube-api-access-p8sf9\") pod \"test-operator-controller-manager-ffcdd6c94-mxz8d\" (UID: \"d454e0e1-1745-4fc0-aea1-9d231de7fa65\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mxz8d" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.639160 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27b8cb71-5bd2-4133-bf5a-db571521861b-cert\") pod \"infra-operator-controller-manager-585fc5b659-4zsvx\" (UID: \"27b8cb71-5bd2-4133-bf5a-db571521861b\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-4zsvx" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.643521 4740 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-controller-manager-5647484f69-cxbqt"] Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.645510 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5647484f69-cxbqt" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.652273 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-fvgz8" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.652040 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-r494n" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.652412 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44zzr\" (UniqueName: \"kubernetes.io/projected/594a8f26-8acc-44a8-b024-665012e570f6-kube-api-access-44zzr\") pod \"telemetry-operator-controller-manager-578874c84d-6fng6\" (UID: \"594a8f26-8acc-44a8-b024-665012e570f6\") " pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-6fng6" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.653627 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.672056 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5647484f69-cxbqt"] Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.694209 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-q5tlk" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.696889 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c5n9g"] Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.706461 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c5n9g" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.710339 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c5n9g"] Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.715250 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-hkgp9" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.726868 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-gg5c4" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.730422 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd1eb7dc-bc88-4a8e-b681-751ebdf2089f-cert\") pod \"openstack-operator-controller-manager-5647484f69-cxbqt\" (UID: \"cd1eb7dc-bc88-4a8e-b681-751ebdf2089f\") " pod="openstack-operators/openstack-operator-controller-manager-5647484f69-cxbqt" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.730560 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glpfr\" (UniqueName: \"kubernetes.io/projected/cd1eb7dc-bc88-4a8e-b681-751ebdf2089f-kube-api-access-glpfr\") pod \"openstack-operator-controller-manager-5647484f69-cxbqt\" (UID: \"cd1eb7dc-bc88-4a8e-b681-751ebdf2089f\") " pod="openstack-operators/openstack-operator-controller-manager-5647484f69-cxbqt" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.730673 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8sf9\" (UniqueName: \"kubernetes.io/projected/d454e0e1-1745-4fc0-aea1-9d231de7fa65-kube-api-access-p8sf9\") pod \"test-operator-controller-manager-ffcdd6c94-mxz8d\" (UID: \"d454e0e1-1745-4fc0-aea1-9d231de7fa65\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mxz8d" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.730719 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfdrm\" (UniqueName: \"kubernetes.io/projected/65d851bd-9407-48da-bac4-d3b07bab1d46-kube-api-access-tfdrm\") pod \"watcher-operator-controller-manager-646675d848-ngw9n\" (UID: \"65d851bd-9407-48da-bac4-d3b07bab1d46\") " pod="openstack-operators/watcher-operator-controller-manager-646675d848-ngw9n" Oct 09 10:42:19 crc 
kubenswrapper[4740]: I1009 10:42:19.771097 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8sf9\" (UniqueName: \"kubernetes.io/projected/d454e0e1-1745-4fc0-aea1-9d231de7fa65-kube-api-access-p8sf9\") pod \"test-operator-controller-manager-ffcdd6c94-mxz8d\" (UID: \"d454e0e1-1745-4fc0-aea1-9d231de7fa65\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mxz8d" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.783723 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-w2ftw"] Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.790478 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-6fng6" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.833551 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd1eb7dc-bc88-4a8e-b681-751ebdf2089f-cert\") pod \"openstack-operator-controller-manager-5647484f69-cxbqt\" (UID: \"cd1eb7dc-bc88-4a8e-b681-751ebdf2089f\") " pod="openstack-operators/openstack-operator-controller-manager-5647484f69-cxbqt" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.833595 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3179f3c7-2f14-494b-9fea-3c217a11af2b-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx\" (UID: \"3179f3c7-2f14-494b-9fea-3c217a11af2b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.833709 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqg7l\" (UniqueName: 
\"kubernetes.io/projected/00c4b19b-1c03-4fc2-9ac1-39ca45ca9570-kube-api-access-rqg7l\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-c5n9g\" (UID: \"00c4b19b-1c03-4fc2-9ac1-39ca45ca9570\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c5n9g" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.833779 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glpfr\" (UniqueName: \"kubernetes.io/projected/cd1eb7dc-bc88-4a8e-b681-751ebdf2089f-kube-api-access-glpfr\") pod \"openstack-operator-controller-manager-5647484f69-cxbqt\" (UID: \"cd1eb7dc-bc88-4a8e-b681-751ebdf2089f\") " pod="openstack-operators/openstack-operator-controller-manager-5647484f69-cxbqt" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.833822 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfdrm\" (UniqueName: \"kubernetes.io/projected/65d851bd-9407-48da-bac4-d3b07bab1d46-kube-api-access-tfdrm\") pod \"watcher-operator-controller-manager-646675d848-ngw9n\" (UID: \"65d851bd-9407-48da-bac4-d3b07bab1d46\") " pod="openstack-operators/watcher-operator-controller-manager-646675d848-ngw9n" Oct 09 10:42:19 crc kubenswrapper[4740]: E1009 10:42:19.835400 4740 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 09 10:42:19 crc kubenswrapper[4740]: E1009 10:42:19.835446 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3179f3c7-2f14-494b-9fea-3c217a11af2b-cert podName:3179f3c7-2f14-494b-9fea-3c217a11af2b nodeName:}" failed. No retries permitted until 2025-10-09 10:42:20.835432026 +0000 UTC m=+879.797632407 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3179f3c7-2f14-494b-9fea-3c217a11af2b-cert") pod "openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx" (UID: "3179f3c7-2f14-494b-9fea-3c217a11af2b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.844778 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mxz8d" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.846566 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd1eb7dc-bc88-4a8e-b681-751ebdf2089f-cert\") pod \"openstack-operator-controller-manager-5647484f69-cxbqt\" (UID: \"cd1eb7dc-bc88-4a8e-b681-751ebdf2089f\") " pod="openstack-operators/openstack-operator-controller-manager-5647484f69-cxbqt" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.862761 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glpfr\" (UniqueName: \"kubernetes.io/projected/cd1eb7dc-bc88-4a8e-b681-751ebdf2089f-kube-api-access-glpfr\") pod \"openstack-operator-controller-manager-5647484f69-cxbqt\" (UID: \"cd1eb7dc-bc88-4a8e-b681-751ebdf2089f\") " pod="openstack-operators/openstack-operator-controller-manager-5647484f69-cxbqt" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.863912 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfdrm\" (UniqueName: \"kubernetes.io/projected/65d851bd-9407-48da-bac4-d3b07bab1d46-kube-api-access-tfdrm\") pod \"watcher-operator-controller-manager-646675d848-ngw9n\" (UID: \"65d851bd-9407-48da-bac4-d3b07bab1d46\") " pod="openstack-operators/watcher-operator-controller-manager-646675d848-ngw9n" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.886542 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-mp2p4"] Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.907337 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-4zsvx" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.922704 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-646675d848-ngw9n" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.934717 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqg7l\" (UniqueName: \"kubernetes.io/projected/00c4b19b-1c03-4fc2-9ac1-39ca45ca9570-kube-api-access-rqg7l\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-c5n9g\" (UID: \"00c4b19b-1c03-4fc2-9ac1-39ca45ca9570\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c5n9g" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.960293 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqg7l\" (UniqueName: \"kubernetes.io/projected/00c4b19b-1c03-4fc2-9ac1-39ca45ca9570-kube-api-access-rqg7l\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-c5n9g\" (UID: \"00c4b19b-1c03-4fc2-9ac1-39ca45ca9570\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c5n9g" Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.963745 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-w2ftw" event={"ID":"2b3fb0c8-988f-4ed4-86e0-77db8e5e06a8","Type":"ContainerStarted","Data":"00fccdc0962b6cc5a4848315abdd2d1905bf3637928c209d3620f60d0ed3cdff"} Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.967110 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-mp2p4" 
event={"ID":"95b86671-972c-4a57-b68b-0421b82bd3d4","Type":"ContainerStarted","Data":"4bafd04c2ce6f475c54b6f22df6769f9dea28b4e049ef5159e18fd5bcd68464e"} Oct 09 10:42:19 crc kubenswrapper[4740]: I1009 10:42:19.968118 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5647484f69-cxbqt" Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.020948 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-prf5f"] Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.087434 4740 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jqnnv" secret="" err="failed to sync secret cache: timed out waiting for the condition" Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.087527 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jqnnv" Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.088950 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c5n9g" Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.138282 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-9pwf2" Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.330681 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-6jtst"] Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.338532 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-sf7cf"] Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.345106 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-hsl94"] Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.349281 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-cz2dz"] Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.475105 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-8nkmb"] Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.479856 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-p4btw"] Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.482442 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-268g9"] Oct 09 10:42:20 crc kubenswrapper[4740]: W1009 10:42:20.482898 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5348e551_de55_4c32_af1e_ac9facc061d9.slice/crio-f1e3add5da9cf023178ebd1478452f93ea17b438f34f19ee4d4204b0723ca549 
WatchSource:0}: Error finding container f1e3add5da9cf023178ebd1478452f93ea17b438f34f19ee4d4204b0723ca549: Status 404 returned error can't find the container with id f1e3add5da9cf023178ebd1478452f93ea17b438f34f19ee4d4204b0723ca549 Oct 09 10:42:20 crc kubenswrapper[4740]: W1009 10:42:20.491646 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9f75d3c_e107_48aa_b15b_442b785b8945.slice/crio-26d986da2fd42e80a2f31f0f266c180f243952204b93c8a2b0577ae98ce5e060 WatchSource:0}: Error finding container 26d986da2fd42e80a2f31f0f266c180f243952204b93c8a2b0577ae98ce5e060: Status 404 returned error can't find the container with id 26d986da2fd42e80a2f31f0f266c180f243952204b93c8a2b0577ae98ce5e060 Oct 09 10:42:20 crc kubenswrapper[4740]: W1009 10:42:20.492365 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3c08e43_cc8b_433e_ba8e_fd225eef09ed.slice/crio-2430c9180522756d731b91fcad186720ef7e7ff7db8dc260c56d10b473c59978 WatchSource:0}: Error finding container 2430c9180522756d731b91fcad186720ef7e7ff7db8dc260c56d10b473c59978: Status 404 returned error can't find the container with id 2430c9180522756d731b91fcad186720ef7e7ff7db8dc260c56d10b473c59978 Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.554473 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-q5tlk"] Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.558936 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-578874c84d-6fng6"] Oct 09 10:42:20 crc kubenswrapper[4740]: W1009 10:42:20.560815 4740 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1909d9f_c6e3_4c55_93f5_be679e3c3792.slice/crio-99bc328b8c879729a985e754e39d21782d8d0bb08c6e985e3343db8acd24e663 WatchSource:0}: Error finding container 99bc328b8c879729a985e754e39d21782d8d0bb08c6e985e3343db8acd24e663: Status 404 returned error can't find the container with id 99bc328b8c879729a985e754e39d21782d8d0bb08c6e985e3343db8acd24e663 Oct 09 10:42:20 crc kubenswrapper[4740]: W1009 10:42:20.566402 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod594a8f26_8acc_44a8_b024_665012e570f6.slice/crio-777ed7ac969ccd287b3874043166c0a6351bd0bbb3d26e2417fda4f820e76b7a WatchSource:0}: Error finding container 777ed7ac969ccd287b3874043166c0a6351bd0bbb3d26e2417fda4f820e76b7a: Status 404 returned error can't find the container with id 777ed7ac969ccd287b3874043166c0a6351bd0bbb3d26e2417fda4f820e76b7a Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.729445 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-646675d848-ngw9n"] Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.738495 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c5n9g"] Oct 09 10:42:20 crc kubenswrapper[4740]: W1009 10:42:20.753204 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65d851bd_9407_48da_bac4_d3b07bab1d46.slice/crio-3b132f03b320d5bd0966fd739fbc0e2551af30f601fa79b3dd829f73059eeaf2 WatchSource:0}: Error finding container 3b132f03b320d5bd0966fd739fbc0e2551af30f601fa79b3dd829f73059eeaf2: Status 404 returned error can't find the container with id 3b132f03b320d5bd0966fd739fbc0e2551af30f601fa79b3dd829f73059eeaf2 Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.762780 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-mxz8d"] Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.764627 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-gg5c4"] Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.775952 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5647484f69-cxbqt"] Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.778589 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-4zsvx"] Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.788241 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-kv5jg"] Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.795702 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jqnnv"] Oct 09 10:42:20 crc kubenswrapper[4740]: W1009 10:42:20.796975 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27b8cb71_5bd2_4133_bf5a_db571521861b.slice/crio-d25054f9d1488bee4e5585cb2774e038d2b18bbdf8981cfa9b454be9e062c02e WatchSource:0}: Error finding container d25054f9d1488bee4e5585cb2774e038d2b18bbdf8981cfa9b454be9e062c02e: Status 404 returned error can't find the container with id d25054f9d1488bee4e5585cb2774e038d2b18bbdf8981cfa9b454be9e062c02e Oct 09 10:42:20 crc kubenswrapper[4740]: E1009 10:42:20.800290 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p8sf9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
test-operator-controller-manager-ffcdd6c94-mxz8d_openstack-operators(d454e0e1-1745-4fc0-aea1-9d231de7fa65): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.803457 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-h4lw2"] Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.807607 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f96f8c84-r494n"] Oct 09 10:42:20 crc kubenswrapper[4740]: W1009 10:42:20.808019 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a6180f0_55bd_4c7e_a96c_97762cace534.slice/crio-b02c614541db9928015f90d09b291442a6c18fbe6ce572a1323d4bc3e29b17fe WatchSource:0}: Error finding container b02c614541db9928015f90d09b291442a6c18fbe6ce572a1323d4bc3e29b17fe: Status 404 returned error can't find the container with id b02c614541db9928015f90d09b291442a6c18fbe6ce572a1323d4bc3e29b17fe Oct 09 10:42:20 crc kubenswrapper[4740]: W1009 10:42:20.813366 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e7f599f_1cc9_41fc_b683_8b0de6e48761.slice/crio-f83ef56305d99974c7a65cc2ff2d77c11dd9158c960555c6f6e79dd1d957f9f5 WatchSource:0}: Error finding container f83ef56305d99974c7a65cc2ff2d77c11dd9158c960555c6f6e79dd1d957f9f5: Status 404 returned error can't find the container with id f83ef56305d99974c7a65cc2ff2d77c11dd9158c960555c6f6e79dd1d957f9f5 Oct 09 10:42:20 crc kubenswrapper[4740]: W1009 10:42:20.814158 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod676e4e26_21ec_4b2c_ab3f_bc593cddfb33.slice/crio-86a97eb9a86d5846598834c24bf3bd8fc45a9bd899b0ad9c51815644c62ce627 WatchSource:0}: Error finding container 
86a97eb9a86d5846598834c24bf3bd8fc45a9bd899b0ad9c51815644c62ce627: Status 404 returned error can't find the container with id 86a97eb9a86d5846598834c24bf3bd8fc45a9bd899b0ad9c51815644c62ce627 Oct 09 10:42:20 crc kubenswrapper[4740]: E1009 10:42:20.816703 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rvqg9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-57bb74c7bf-kv5jg_openstack-operators(676e4e26-21ec-4b2c-ab3f-bc593cddfb33): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 10:42:20 crc kubenswrapper[4740]: E1009 10:42:20.817359 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:551b59e107c9812f7ad7aa06577376b0dcb58ff9498a41d5d5273e60e20ba7e4,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pktrx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-6f96f8c84-r494n_openstack-operators(7e7f599f-1cc9-41fc-b683-8b0de6e48761): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 10:42:20 crc kubenswrapper[4740]: W1009 10:42:20.819475 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod254d742d_881a_4ea9_97fd_2246d7109a77.slice/crio-49cbcdc10acf38efeedc51d23ca6b03c4720e447a3e217853e07985ab0e646b4 WatchSource:0}: Error finding container 
49cbcdc10acf38efeedc51d23ca6b03c4720e447a3e217853e07985ab0e646b4: Status 404 returned error can't find the container with id 49cbcdc10acf38efeedc51d23ca6b03c4720e447a3e217853e07985ab0e646b4 Oct 09 10:42:20 crc kubenswrapper[4740]: W1009 10:42:20.821945 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93f4faa8_4d5e_48d9_ac5a_bb1468f972d3.slice/crio-806df035ecc0cd8225d30d6860b7e46d7036502d3a6a1aaa6bb3ae5a7b169d7b WatchSource:0}: Error finding container 806df035ecc0cd8225d30d6860b7e46d7036502d3a6a1aaa6bb3ae5a7b169d7b: Status 404 returned error can't find the container with id 806df035ecc0cd8225d30d6860b7e46d7036502d3a6a1aaa6bb3ae5a7b169d7b Oct 09 10:42:20 crc kubenswrapper[4740]: E1009 10:42:20.832021 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:783f711b4cb179819cfcb81167c3591c70671440f4551bbe48b7a8730567f577,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n65l4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-64f84fcdbb-jqnnv_openstack-operators(93f4faa8-4d5e-48d9-ac5a-bb1468f972d3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 10:42:20 crc kubenswrapper[4740]: E1009 10:42:20.832064 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:33652e75a03a058769019fe8d8c51585a6eeefef5e1ecb96f9965434117954f2,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: 
{{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lfq9q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-797d478b46-h4lw2_openstack-operators(254d742d-881a-4ea9-97fd-2246d7109a77): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 10:42:20 crc kubenswrapper[4740]: E1009 10:42:20.832186 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kkgx2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f4d5dfdc6-gg5c4_openstack-operators(1a6180f0-55bd-4c7e-a96c-97762cace534): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.852279 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3179f3c7-2f14-494b-9fea-3c217a11af2b-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx\" (UID: \"3179f3c7-2f14-494b-9fea-3c217a11af2b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx" Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.867511 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3179f3c7-2f14-494b-9fea-3c217a11af2b-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx\" (UID: \"3179f3c7-2f14-494b-9fea-3c217a11af2b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx" Oct 09 10:42:20 crc kubenswrapper[4740]: I1009 10:42:20.988022 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-664664cb68-q5tlk" event={"ID":"f1909d9f-c6e3-4c55-93f5-be679e3c3792","Type":"ContainerStarted","Data":"99bc328b8c879729a985e754e39d21782d8d0bb08c6e985e3343db8acd24e663"} Oct 09 10:42:21 crc kubenswrapper[4740]: I1009 10:42:21.020350 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-gg5c4" event={"ID":"1a6180f0-55bd-4c7e-a96c-97762cace534","Type":"ContainerStarted","Data":"b02c614541db9928015f90d09b291442a6c18fbe6ce572a1323d4bc3e29b17fe"} Oct 09 10:42:21 crc kubenswrapper[4740]: I1009 10:42:21.025718 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-8nkmb" event={"ID":"a9f75d3c-e107-48aa-b15b-442b785b8945","Type":"ContainerStarted","Data":"26d986da2fd42e80a2f31f0f266c180f243952204b93c8a2b0577ae98ce5e060"} Oct 09 10:42:21 crc kubenswrapper[4740]: I1009 10:42:21.028653 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-cz2dz" event={"ID":"8313eb28-2711-404c-817c-b782ea1cf41a","Type":"ContainerStarted","Data":"dd0037d64aff3617abb59ae841ecac458dddd8e0f62f16072c752792f5c0bb07"} Oct 09 10:42:21 crc kubenswrapper[4740]: I1009 10:42:21.031100 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hsl94" event={"ID":"67fd364b-d05e-4d57-a817-3f64be5cdba0","Type":"ContainerStarted","Data":"7202a8ef778c2aa5a52fe1f7bfd91d24f8eb0fe430f33b71021372672ed3807d"} Oct 09 10:42:21 crc kubenswrapper[4740]: I1009 10:42:21.032275 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-268g9" event={"ID":"a3c08e43-cc8b-433e-ba8e-fd225eef09ed","Type":"ContainerStarted","Data":"2430c9180522756d731b91fcad186720ef7e7ff7db8dc260c56d10b473c59978"} Oct 09 10:42:21 crc 
kubenswrapper[4740]: I1009 10:42:21.033064 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-p4btw" event={"ID":"5348e551-de55-4c32-af1e-ac9facc061d9","Type":"ContainerStarted","Data":"f1e3add5da9cf023178ebd1478452f93ea17b438f34f19ee4d4204b0723ca549"} Oct 09 10:42:21 crc kubenswrapper[4740]: I1009 10:42:21.034091 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-kv5jg" event={"ID":"676e4e26-21ec-4b2c-ab3f-bc593cddfb33","Type":"ContainerStarted","Data":"86a97eb9a86d5846598834c24bf3bd8fc45a9bd899b0ad9c51815644c62ce627"} Oct 09 10:42:21 crc kubenswrapper[4740]: I1009 10:42:21.036065 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mxz8d" event={"ID":"d454e0e1-1745-4fc0-aea1-9d231de7fa65","Type":"ContainerStarted","Data":"81f25ed851b52922c731f0348ece077fc4d39f6e9fb8aa3adc512fc8b8ac1d6b"} Oct 09 10:42:21 crc kubenswrapper[4740]: I1009 10:42:21.042205 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-646675d848-ngw9n" event={"ID":"65d851bd-9407-48da-bac4-d3b07bab1d46","Type":"ContainerStarted","Data":"3b132f03b320d5bd0966fd739fbc0e2551af30f601fa79b3dd829f73059eeaf2"} Oct 09 10:42:21 crc kubenswrapper[4740]: I1009 10:42:21.044186 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-4zsvx" event={"ID":"27b8cb71-5bd2-4133-bf5a-db571521861b","Type":"ContainerStarted","Data":"d25054f9d1488bee4e5585cb2774e038d2b18bbdf8981cfa9b454be9e062c02e"} Oct 09 10:42:21 crc kubenswrapper[4740]: I1009 10:42:21.047608 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-h4lw2" 
event={"ID":"254d742d-881a-4ea9-97fd-2246d7109a77","Type":"ContainerStarted","Data":"49cbcdc10acf38efeedc51d23ca6b03c4720e447a3e217853e07985ab0e646b4"} Oct 09 10:42:21 crc kubenswrapper[4740]: I1009 10:42:21.049858 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sf7cf" event={"ID":"1519e3af-34c9-4722-9aaa-8a10ef0d49de","Type":"ContainerStarted","Data":"dce68e07fd2e1fa6c8e44c689cd75e84145be5616002cf39e3b202d371a7b962"} Oct 09 10:42:21 crc kubenswrapper[4740]: I1009 10:42:21.051763 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-r494n" event={"ID":"7e7f599f-1cc9-41fc-b683-8b0de6e48761","Type":"ContainerStarted","Data":"f83ef56305d99974c7a65cc2ff2d77c11dd9158c960555c6f6e79dd1d957f9f5"} Oct 09 10:42:21 crc kubenswrapper[4740]: I1009 10:42:21.054675 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5647484f69-cxbqt" event={"ID":"cd1eb7dc-bc88-4a8e-b681-751ebdf2089f","Type":"ContainerStarted","Data":"c73f83c1c9abdc155c572a060fdca107e9a0b37aafbf3dc3f41d0b3ea4a6e35d"} Oct 09 10:42:21 crc kubenswrapper[4740]: I1009 10:42:21.058442 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-6jtst" event={"ID":"f3e10f4d-eabe-4818-b34f-96dd2ba4d4a1","Type":"ContainerStarted","Data":"f15e0ce50c0f39a5293b4d61d8878847745a36683d3e9a5654c5eaa789fcc4a0"} Oct 09 10:42:21 crc kubenswrapper[4740]: I1009 10:42:21.060419 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-prf5f" event={"ID":"8ae60958-f755-47fd-891b-74356bff787c","Type":"ContainerStarted","Data":"6a5ebfec4c21f76451f841887461b6f67168588200c68dfe9b9951ceb2477cb3"} Oct 09 10:42:21 crc kubenswrapper[4740]: I1009 10:42:21.061617 4740 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jqnnv" event={"ID":"93f4faa8-4d5e-48d9-ac5a-bb1468f972d3","Type":"ContainerStarted","Data":"806df035ecc0cd8225d30d6860b7e46d7036502d3a6a1aaa6bb3ae5a7b169d7b"} Oct 09 10:42:21 crc kubenswrapper[4740]: I1009 10:42:21.063280 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c5n9g" event={"ID":"00c4b19b-1c03-4fc2-9ac1-39ca45ca9570","Type":"ContainerStarted","Data":"f6ba8319dbf686985a82d7f6d036025ef647ec3f2cbdbeba1e14f6f5d675cec3"} Oct 09 10:42:21 crc kubenswrapper[4740]: I1009 10:42:21.064408 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-6fng6" event={"ID":"594a8f26-8acc-44a8-b024-665012e570f6","Type":"ContainerStarted","Data":"777ed7ac969ccd287b3874043166c0a6351bd0bbb3d26e2417fda4f820e76b7a"} Oct 09 10:42:21 crc kubenswrapper[4740]: E1009 10:42:21.067199 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mxz8d" podUID="d454e0e1-1745-4fc0-aea1-9d231de7fa65" Oct 09 10:42:21 crc kubenswrapper[4740]: I1009 10:42:21.132864 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx" Oct 09 10:42:21 crc kubenswrapper[4740]: E1009 10:42:21.227550 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jqnnv" podUID="93f4faa8-4d5e-48d9-ac5a-bb1468f972d3" Oct 09 10:42:21 crc kubenswrapper[4740]: E1009 10:42:21.247164 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-r494n" podUID="7e7f599f-1cc9-41fc-b683-8b0de6e48761" Oct 09 10:42:21 crc kubenswrapper[4740]: E1009 10:42:21.282307 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-gg5c4" podUID="1a6180f0-55bd-4c7e-a96c-97762cace534" Oct 09 10:42:21 crc kubenswrapper[4740]: E1009 10:42:21.310362 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-h4lw2" podUID="254d742d-881a-4ea9-97fd-2246d7109a77" Oct 09 10:42:21 crc kubenswrapper[4740]: E1009 10:42:21.324319 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-kv5jg" podUID="676e4e26-21ec-4b2c-ab3f-bc593cddfb33" Oct 09 10:42:21 crc kubenswrapper[4740]: I1009 10:42:21.520817 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx"] Oct 
09 10:42:22 crc kubenswrapper[4740]: I1009 10:42:22.141802 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5647484f69-cxbqt" event={"ID":"cd1eb7dc-bc88-4a8e-b681-751ebdf2089f","Type":"ContainerStarted","Data":"469dda660caacd3e8fd5798d4e0be858fadae60c1d7ca00f5152d626c2a52f12"} Oct 09 10:42:22 crc kubenswrapper[4740]: I1009 10:42:22.142194 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5647484f69-cxbqt" Oct 09 10:42:22 crc kubenswrapper[4740]: I1009 10:42:22.142210 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5647484f69-cxbqt" event={"ID":"cd1eb7dc-bc88-4a8e-b681-751ebdf2089f","Type":"ContainerStarted","Data":"90154f1db4a0cecda1aec2b72f463944cd572567c819fe8fcd9fc1445fbad194"} Oct 09 10:42:22 crc kubenswrapper[4740]: I1009 10:42:22.145585 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-kv5jg" event={"ID":"676e4e26-21ec-4b2c-ab3f-bc593cddfb33","Type":"ContainerStarted","Data":"d201cf7e012a5c7f06f5d7501a7c432e99e67588fe3c9baca2271d850f4ae8f3"} Oct 09 10:42:22 crc kubenswrapper[4740]: E1009 10:42:22.157700 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd\\\"\"" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-kv5jg" podUID="676e4e26-21ec-4b2c-ab3f-bc593cddfb33" Oct 09 10:42:22 crc kubenswrapper[4740]: I1009 10:42:22.158885 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-gg5c4" 
event={"ID":"1a6180f0-55bd-4c7e-a96c-97762cace534","Type":"ContainerStarted","Data":"7fbff4d561592de0c18e8be37ee10bb3a558303a262861de818df5b126cd81ad"} Oct 09 10:42:22 crc kubenswrapper[4740]: E1009 10:42:22.160649 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-gg5c4" podUID="1a6180f0-55bd-4c7e-a96c-97762cace534" Oct 09 10:42:22 crc kubenswrapper[4740]: I1009 10:42:22.161895 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mxz8d" event={"ID":"d454e0e1-1745-4fc0-aea1-9d231de7fa65","Type":"ContainerStarted","Data":"5a038b74afc58131258cc27297befbed84ed740979df5de37863a521e5a3d5e4"} Oct 09 10:42:22 crc kubenswrapper[4740]: E1009 10:42:22.163523 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mxz8d" podUID="d454e0e1-1745-4fc0-aea1-9d231de7fa65" Oct 09 10:42:22 crc kubenswrapper[4740]: I1009 10:42:22.191576 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jqnnv" event={"ID":"93f4faa8-4d5e-48d9-ac5a-bb1468f972d3","Type":"ContainerStarted","Data":"bc5f1ade522d6eb8830dc45c1cd0bf60ddd14b64b0f767f36dc1fa48f1d1ac1b"} Oct 09 10:42:22 crc kubenswrapper[4740]: E1009 10:42:22.193788 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:783f711b4cb179819cfcb81167c3591c70671440f4551bbe48b7a8730567f577\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jqnnv" podUID="93f4faa8-4d5e-48d9-ac5a-bb1468f972d3" Oct 09 10:42:22 crc kubenswrapper[4740]: I1009 10:42:22.202288 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-h4lw2" event={"ID":"254d742d-881a-4ea9-97fd-2246d7109a77","Type":"ContainerStarted","Data":"d99ae88623658d0b4953c449e7e6ff25a11a57821ee6b62d990b9f868763f501"} Oct 09 10:42:22 crc kubenswrapper[4740]: E1009 10:42:22.224421 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:33652e75a03a058769019fe8d8c51585a6eeefef5e1ecb96f9965434117954f2\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-h4lw2" podUID="254d742d-881a-4ea9-97fd-2246d7109a77" Oct 09 10:42:22 crc kubenswrapper[4740]: I1009 10:42:22.260732 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx" event={"ID":"3179f3c7-2f14-494b-9fea-3c217a11af2b","Type":"ContainerStarted","Data":"42047f396baf808d667ab63119e4e95b37d093af925a92acb0e37f29977642d5"} Oct 09 10:42:22 crc kubenswrapper[4740]: I1009 10:42:22.262742 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-r494n" event={"ID":"7e7f599f-1cc9-41fc-b683-8b0de6e48761","Type":"ContainerStarted","Data":"59f56e11ec4c95624d0bfb9f41793edb3ba305e692c815330752d185dd175de2"} Oct 09 10:42:22 crc kubenswrapper[4740]: E1009 10:42:22.295464 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:551b59e107c9812f7ad7aa06577376b0dcb58ff9498a41d5d5273e60e20ba7e4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-r494n" podUID="7e7f599f-1cc9-41fc-b683-8b0de6e48761" Oct 09 10:42:22 crc kubenswrapper[4740]: I1009 10:42:22.475490 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5647484f69-cxbqt" podStartSLOduration=3.47547024 podStartE2EDuration="3.47547024s" podCreationTimestamp="2025-10-09 10:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:42:22.4733175 +0000 UTC m=+881.435517881" watchObservedRunningTime="2025-10-09 10:42:22.47547024 +0000 UTC m=+881.437670621" Oct 09 10:42:23 crc kubenswrapper[4740]: E1009 10:42:23.272305 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:783f711b4cb179819cfcb81167c3591c70671440f4551bbe48b7a8730567f577\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jqnnv" podUID="93f4faa8-4d5e-48d9-ac5a-bb1468f972d3" Oct 09 10:42:23 crc kubenswrapper[4740]: E1009 10:42:23.272509 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:33652e75a03a058769019fe8d8c51585a6eeefef5e1ecb96f9965434117954f2\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-h4lw2" podUID="254d742d-881a-4ea9-97fd-2246d7109a77" Oct 09 10:42:23 crc kubenswrapper[4740]: E1009 10:42:23.273923 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd\\\"\"" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-kv5jg" podUID="676e4e26-21ec-4b2c-ab3f-bc593cddfb33" Oct 09 10:42:23 crc kubenswrapper[4740]: E1009 10:42:23.273952 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mxz8d" podUID="d454e0e1-1745-4fc0-aea1-9d231de7fa65" Oct 09 10:42:23 crc kubenswrapper[4740]: E1009 10:42:23.273984 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:551b59e107c9812f7ad7aa06577376b0dcb58ff9498a41d5d5273e60e20ba7e4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-r494n" podUID="7e7f599f-1cc9-41fc-b683-8b0de6e48761" Oct 09 10:42:23 crc kubenswrapper[4740]: E1009 10:42:23.273978 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-gg5c4" podUID="1a6180f0-55bd-4c7e-a96c-97762cace534" Oct 09 10:42:29 crc kubenswrapper[4740]: I1009 10:42:29.977535 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5647484f69-cxbqt" Oct 09 10:42:31 crc kubenswrapper[4740]: I1009 10:42:31.334889 4740 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-268g9" event={"ID":"a3c08e43-cc8b-433e-ba8e-fd225eef09ed","Type":"ContainerStarted","Data":"117077609c552ce0f720875247b6f1e88fe2787b768b39a668620cbd65d3d490"} Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.372308 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-6jtst" event={"ID":"f3e10f4d-eabe-4818-b34f-96dd2ba4d4a1","Type":"ContainerStarted","Data":"e3d249cbccaf5decb87600f46400de5df822d657537f0bf609721eaf6eeaaa3a"} Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.385405 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-6fng6" event={"ID":"594a8f26-8acc-44a8-b024-665012e570f6","Type":"ContainerStarted","Data":"7b7e9ebaeb19e2aa6e8f34bb0810ff59d123e89d09a26299fa3392428b3bfd42"} Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.403054 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-646675d848-ngw9n" event={"ID":"65d851bd-9407-48da-bac4-d3b07bab1d46","Type":"ContainerStarted","Data":"880a75af91b7d250dc2651ea9f6e322db686a3f8d4820af713c35b5ee7e23507"} Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.423331 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx" event={"ID":"3179f3c7-2f14-494b-9fea-3c217a11af2b","Type":"ContainerStarted","Data":"67d90b67c3c50701a036785c8e393e4eff5efe61fba02b710e38aceb40529265"} Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.437007 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-p4btw" 
event={"ID":"5348e551-de55-4c32-af1e-ac9facc061d9","Type":"ContainerStarted","Data":"ffbcf62b2c7da050d436c810a36a4e2e8cae55d9ad7e906826e759dbe453c365"} Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.457341 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c5n9g" event={"ID":"00c4b19b-1c03-4fc2-9ac1-39ca45ca9570","Type":"ContainerStarted","Data":"05b6829a06d62cd86d27d2d508be3e8559a1968e875087a91394e2eb1e35acfd"} Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.471042 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-w2ftw" event={"ID":"2b3fb0c8-988f-4ed4-86e0-77db8e5e06a8","Type":"ContainerStarted","Data":"c741fa43a6a3ae8a7607729299919c64c11348a68ad2efbdb27f066737fcecd4"} Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.471084 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-w2ftw" event={"ID":"2b3fb0c8-988f-4ed4-86e0-77db8e5e06a8","Type":"ContainerStarted","Data":"0dc90490e7fa522cd560bc88f7b92ac30860e715394db413d91a3deb5f9dbf3d"} Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.471591 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-w2ftw" Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.481509 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hsl94" event={"ID":"67fd364b-d05e-4d57-a817-3f64be5cdba0","Type":"ContainerStarted","Data":"e2698ea69b9d376f3a8a161370b19aa51f2c1004ad93105a346ab500dde1d28d"} Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.481558 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hsl94" 
event={"ID":"67fd364b-d05e-4d57-a817-3f64be5cdba0","Type":"ContainerStarted","Data":"4407e9228e156559bc13e05d3d1501f74af0c70d3128726fc0f914c2bac8b339"} Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.482171 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hsl94" Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.495290 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-268g9" event={"ID":"a3c08e43-cc8b-433e-ba8e-fd225eef09ed","Type":"ContainerStarted","Data":"6a74038956ac1d000a667e12b7da01715b2955944f55e590a7cfac2bf3c3a815"} Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.495575 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-268g9" Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.503279 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-q5tlk" event={"ID":"f1909d9f-c6e3-4c55-93f5-be679e3c3792","Type":"ContainerStarted","Data":"05a17ebd085fe30a32289c2ecb7aebbcbcd6c404d3e433e052c38155b924353c"} Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.513290 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c5n9g" podStartSLOduration=3.26771545 podStartE2EDuration="13.513272082s" podCreationTimestamp="2025-10-09 10:42:19 +0000 UTC" firstStartedPulling="2025-10-09 10:42:20.738679039 +0000 UTC m=+879.700879420" lastFinishedPulling="2025-10-09 10:42:30.984235671 +0000 UTC m=+889.946436052" observedRunningTime="2025-10-09 10:42:32.481921291 +0000 UTC m=+891.444121672" watchObservedRunningTime="2025-10-09 10:42:32.513272082 +0000 UTC m=+891.475472463" Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.513844 4740 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-w2ftw" podStartSLOduration=3.254477303 podStartE2EDuration="14.513839928s" podCreationTimestamp="2025-10-09 10:42:18 +0000 UTC" firstStartedPulling="2025-10-09 10:42:19.804135027 +0000 UTC m=+878.766335408" lastFinishedPulling="2025-10-09 10:42:31.063497652 +0000 UTC m=+890.025698033" observedRunningTime="2025-10-09 10:42:32.505232129 +0000 UTC m=+891.467432500" watchObservedRunningTime="2025-10-09 10:42:32.513839928 +0000 UTC m=+891.476040309" Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.514950 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-mp2p4" event={"ID":"95b86671-972c-4a57-b68b-0421b82bd3d4","Type":"ContainerStarted","Data":"65ad402cc6f05563bcf6ec571da5d29d85f555ac8e8f3405fd5fc1fb98b61a90"} Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.530309 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-prf5f" event={"ID":"8ae60958-f755-47fd-891b-74356bff787c","Type":"ContainerStarted","Data":"c9e0084ba036c6cf73c0a48ea98a2eee83d7060875bde36a3e384cbe6e501852"} Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.530440 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-prf5f" Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.539283 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hsl94" podStartSLOduration=3.885950219 podStartE2EDuration="14.539270034s" podCreationTimestamp="2025-10-09 10:42:18 +0000 UTC" firstStartedPulling="2025-10-09 10:42:20.360110716 +0000 UTC m=+879.322311097" lastFinishedPulling="2025-10-09 10:42:31.013430531 +0000 UTC m=+889.975630912" 
observedRunningTime="2025-10-09 10:42:32.538285806 +0000 UTC m=+891.500486187" watchObservedRunningTime="2025-10-09 10:42:32.539270034 +0000 UTC m=+891.501470405" Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.552317 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-8nkmb" event={"ID":"a9f75d3c-e107-48aa-b15b-442b785b8945","Type":"ContainerStarted","Data":"ed0146b1ab1cd122d8308b14099af2e41081cb2d9b079d867e7f9982fe88c779"} Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.554101 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-8nkmb" Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.562100 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-4zsvx" event={"ID":"27b8cb71-5bd2-4133-bf5a-db571521861b","Type":"ContainerStarted","Data":"72ff7f755519d0c9bbce38b14997accf63d3b4dd6946dbc931f92f5e34cefd1c"} Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.562147 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-4zsvx" event={"ID":"27b8cb71-5bd2-4133-bf5a-db571521861b","Type":"ContainerStarted","Data":"7ffe2941988e5936b35d7c4218206b14f5067081015d6615bceecfc6e652b493"} Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.562461 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-4zsvx" Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.571006 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-prf5f" podStartSLOduration=3.636177473 podStartE2EDuration="14.570988715s" podCreationTimestamp="2025-10-09 10:42:18 +0000 UTC" firstStartedPulling="2025-10-09 
10:42:20.045618663 +0000 UTC m=+879.007819044" lastFinishedPulling="2025-10-09 10:42:30.980429865 +0000 UTC m=+889.942630286" observedRunningTime="2025-10-09 10:42:32.563846036 +0000 UTC m=+891.526046417" watchObservedRunningTime="2025-10-09 10:42:32.570988715 +0000 UTC m=+891.533189096" Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.573908 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-cz2dz" event={"ID":"8313eb28-2711-404c-817c-b782ea1cf41a","Type":"ContainerStarted","Data":"034d652b09b79ed6fa3f5393beea9adc944c6c92cd9e91d975fdc8b090c7a1b2"} Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.575296 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sf7cf" event={"ID":"1519e3af-34c9-4722-9aaa-8a10ef0d49de","Type":"ContainerStarted","Data":"89cdc1bd7237b3f0612daabedf12ad82e2625369c10d0214d411ee5aa5a6cf26"} Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.600329 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-268g9" podStartSLOduration=4.074572787 podStartE2EDuration="14.600308599s" podCreationTimestamp="2025-10-09 10:42:18 +0000 UTC" firstStartedPulling="2025-10-09 10:42:20.494233671 +0000 UTC m=+879.456434052" lastFinishedPulling="2025-10-09 10:42:31.019969483 +0000 UTC m=+889.982169864" observedRunningTime="2025-10-09 10:42:32.588984274 +0000 UTC m=+891.551184645" watchObservedRunningTime="2025-10-09 10:42:32.600308599 +0000 UTC m=+891.562508980" Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.621177 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-4zsvx" podStartSLOduration=4.423874407 podStartE2EDuration="14.621163788s" podCreationTimestamp="2025-10-09 10:42:18 +0000 UTC" 
firstStartedPulling="2025-10-09 10:42:20.809187347 +0000 UTC m=+879.771387728" lastFinishedPulling="2025-10-09 10:42:31.006476728 +0000 UTC m=+889.968677109" observedRunningTime="2025-10-09 10:42:32.620685235 +0000 UTC m=+891.582885616" watchObservedRunningTime="2025-10-09 10:42:32.621163788 +0000 UTC m=+891.583364169" Oct 09 10:42:32 crc kubenswrapper[4740]: I1009 10:42:32.642126 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-8nkmb" podStartSLOduration=4.120331628 podStartE2EDuration="14.64210497s" podCreationTimestamp="2025-10-09 10:42:18 +0000 UTC" firstStartedPulling="2025-10-09 10:42:20.492946235 +0000 UTC m=+879.455146616" lastFinishedPulling="2025-10-09 10:42:31.014719537 +0000 UTC m=+889.976919958" observedRunningTime="2025-10-09 10:42:32.642037938 +0000 UTC m=+891.604238319" watchObservedRunningTime="2025-10-09 10:42:32.64210497 +0000 UTC m=+891.604305351" Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.585655 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-6fng6" event={"ID":"594a8f26-8acc-44a8-b024-665012e570f6","Type":"ContainerStarted","Data":"ab5088c547e35cec232da277411492bbb60b92119069197fa0619841ab954bc0"} Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.587290 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-6fng6" Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.588403 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-646675d848-ngw9n" event={"ID":"65d851bd-9407-48da-bac4-d3b07bab1d46","Type":"ContainerStarted","Data":"e6c8a7af0a029ac156acffff85ae686459430f2e61cbd261a716e33d1d5d60e3"} Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.590467 4740 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-8nkmb" event={"ID":"a9f75d3c-e107-48aa-b15b-442b785b8945","Type":"ContainerStarted","Data":"6fea6d8b9273135ff5a6fb1f6a89bef8972a0568a1ea5465a87a9b1fd245d95b"} Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.593150 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-cz2dz" event={"ID":"8313eb28-2711-404c-817c-b782ea1cf41a","Type":"ContainerStarted","Data":"db78b1dda91aa0a96182ac8f46d1eb25eb7f81619acc11d6a3edb42ff294747b"} Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.593301 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-cz2dz" Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.594980 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-prf5f" event={"ID":"8ae60958-f755-47fd-891b-74356bff787c","Type":"ContainerStarted","Data":"cb1ed136fd3d5224e0a30fa64a10ad5bd4258fbc43711836e81aae9b9e5576ba"} Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.596776 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sf7cf" event={"ID":"1519e3af-34c9-4722-9aaa-8a10ef0d49de","Type":"ContainerStarted","Data":"8493adb01590fec358757ff1d9898e4fbd5608f37dddeaa7cf592585700a1818"} Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.596863 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sf7cf" Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.598455 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-mp2p4" 
event={"ID":"95b86671-972c-4a57-b68b-0421b82bd3d4","Type":"ContainerStarted","Data":"e52c6284ed3509820c4674a95ab55a0025a9c09994de72ff9ad7664fd1febfe4"} Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.598553 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-mp2p4" Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.600285 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-6jtst" event={"ID":"f3e10f4d-eabe-4818-b34f-96dd2ba4d4a1","Type":"ContainerStarted","Data":"c59d907745635e6f3729655243d1162ca3914aa39ccd9a968217ae64c37df089"} Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.600481 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-59578bc799-6jtst" Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.603037 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx" event={"ID":"3179f3c7-2f14-494b-9fea-3c217a11af2b","Type":"ContainerStarted","Data":"f41dd36d72b01c493054463e05fc3438d2f6e4735c69a11092604e4b1ebfe6c6"} Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.603088 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx" Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.608163 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-p4btw" event={"ID":"5348e551-de55-4c32-af1e-ac9facc061d9","Type":"ContainerStarted","Data":"d13fb7afcf09d95fc04f0f5f38803b8e05d49542f950854984457099bc63d091"} Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.611904 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-664664cb68-q5tlk" event={"ID":"f1909d9f-c6e3-4c55-93f5-be679e3c3792","Type":"ContainerStarted","Data":"3d1aa84df551878a9b421a8cc0379db6cf188ab7a56f8893a98ebd545b83ee1c"} Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.615961 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-6fng6" podStartSLOduration=4.164298919 podStartE2EDuration="14.615946334s" podCreationTimestamp="2025-10-09 10:42:19 +0000 UTC" firstStartedPulling="2025-10-09 10:42:20.568236746 +0000 UTC m=+879.530437127" lastFinishedPulling="2025-10-09 10:42:31.019884161 +0000 UTC m=+889.982084542" observedRunningTime="2025-10-09 10:42:33.610172353 +0000 UTC m=+892.572372724" watchObservedRunningTime="2025-10-09 10:42:33.615946334 +0000 UTC m=+892.578146725" Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.659236 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx" podStartSLOduration=6.199711802 podStartE2EDuration="15.659215625s" podCreationTimestamp="2025-10-09 10:42:18 +0000 UTC" firstStartedPulling="2025-10-09 10:42:21.610925931 +0000 UTC m=+880.573126302" lastFinishedPulling="2025-10-09 10:42:31.070429744 +0000 UTC m=+890.032630125" observedRunningTime="2025-10-09 10:42:33.653466966 +0000 UTC m=+892.615667367" watchObservedRunningTime="2025-10-09 10:42:33.659215625 +0000 UTC m=+892.621416026" Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.679670 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-mp2p4" podStartSLOduration=4.619354786 podStartE2EDuration="15.679646293s" podCreationTimestamp="2025-10-09 10:42:18 +0000 UTC" firstStartedPulling="2025-10-09 10:42:19.95442447 +0000 UTC m=+878.916624851" 
lastFinishedPulling="2025-10-09 10:42:31.014715977 +0000 UTC m=+889.976916358" observedRunningTime="2025-10-09 10:42:33.676854765 +0000 UTC m=+892.639055156" watchObservedRunningTime="2025-10-09 10:42:33.679646293 +0000 UTC m=+892.641846674" Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.699488 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-59578bc799-6jtst" podStartSLOduration=4.995537352 podStartE2EDuration="15.699461873s" podCreationTimestamp="2025-10-09 10:42:18 +0000 UTC" firstStartedPulling="2025-10-09 10:42:20.335946345 +0000 UTC m=+879.298146726" lastFinishedPulling="2025-10-09 10:42:31.039870856 +0000 UTC m=+890.002071247" observedRunningTime="2025-10-09 10:42:33.698214928 +0000 UTC m=+892.660415319" watchObservedRunningTime="2025-10-09 10:42:33.699461873 +0000 UTC m=+892.661662264" Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.717233 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-cz2dz" podStartSLOduration=5.053291536 podStartE2EDuration="15.717209936s" podCreationTimestamp="2025-10-09 10:42:18 +0000 UTC" firstStartedPulling="2025-10-09 10:42:20.356399023 +0000 UTC m=+879.318599404" lastFinishedPulling="2025-10-09 10:42:31.020317413 +0000 UTC m=+889.982517804" observedRunningTime="2025-10-09 10:42:33.714682746 +0000 UTC m=+892.676883137" watchObservedRunningTime="2025-10-09 10:42:33.717209936 +0000 UTC m=+892.679410327" Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.734835 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sf7cf" podStartSLOduration=5.105358312 podStartE2EDuration="15.734815825s" podCreationTimestamp="2025-10-09 10:42:18 +0000 UTC" firstStartedPulling="2025-10-09 10:42:20.357704929 +0000 UTC m=+879.319905310" lastFinishedPulling="2025-10-09 
10:42:30.987162442 +0000 UTC m=+889.949362823" observedRunningTime="2025-10-09 10:42:33.732709866 +0000 UTC m=+892.694910257" watchObservedRunningTime="2025-10-09 10:42:33.734815825 +0000 UTC m=+892.697016216" Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.753630 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-p4btw" podStartSLOduration=5.223710409 podStartE2EDuration="15.753613637s" podCreationTimestamp="2025-10-09 10:42:18 +0000 UTC" firstStartedPulling="2025-10-09 10:42:20.485138928 +0000 UTC m=+879.447339309" lastFinishedPulling="2025-10-09 10:42:31.015042156 +0000 UTC m=+889.977242537" observedRunningTime="2025-10-09 10:42:33.748527315 +0000 UTC m=+892.710727696" watchObservedRunningTime="2025-10-09 10:42:33.753613637 +0000 UTC m=+892.715814018" Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.772943 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-646675d848-ngw9n" podStartSLOduration=4.53020319 podStartE2EDuration="14.772923173s" podCreationTimestamp="2025-10-09 10:42:19 +0000 UTC" firstStartedPulling="2025-10-09 10:42:20.772020415 +0000 UTC m=+879.734220796" lastFinishedPulling="2025-10-09 10:42:31.014740398 +0000 UTC m=+889.976940779" observedRunningTime="2025-10-09 10:42:33.770694971 +0000 UTC m=+892.732895362" watchObservedRunningTime="2025-10-09 10:42:33.772923173 +0000 UTC m=+892.735123554" Oct 09 10:42:33 crc kubenswrapper[4740]: I1009 10:42:33.789240 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-664664cb68-q5tlk" podStartSLOduration=4.330840034 podStartE2EDuration="14.789222726s" podCreationTimestamp="2025-10-09 10:42:19 +0000 UTC" firstStartedPulling="2025-10-09 10:42:20.563447733 +0000 UTC m=+879.525648114" lastFinishedPulling="2025-10-09 10:42:31.021830425 +0000 
UTC m=+889.984030806" observedRunningTime="2025-10-09 10:42:33.788535727 +0000 UTC m=+892.750736118" watchObservedRunningTime="2025-10-09 10:42:33.789222726 +0000 UTC m=+892.751423117" Oct 09 10:42:34 crc kubenswrapper[4740]: I1009 10:42:34.622555 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-p4btw" Oct 09 10:42:34 crc kubenswrapper[4740]: I1009 10:42:34.623309 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-664664cb68-q5tlk" Oct 09 10:42:34 crc kubenswrapper[4740]: I1009 10:42:34.623381 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-646675d848-ngw9n" Oct 09 10:42:35 crc kubenswrapper[4740]: I1009 10:42:35.407802 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 10:42:35 crc kubenswrapper[4740]: I1009 10:42:35.407875 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 10:42:37 crc kubenswrapper[4740]: I1009 10:42:37.649272 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-h4lw2" event={"ID":"254d742d-881a-4ea9-97fd-2246d7109a77","Type":"ContainerStarted","Data":"b488c8327b7a411c00d3e2757723e74a43ad05b0fdc2eae309de09234d7560d2"} Oct 09 10:42:37 crc kubenswrapper[4740]: I1009 10:42:37.649805 4740 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-h4lw2" Oct 09 10:42:37 crc kubenswrapper[4740]: I1009 10:42:37.675051 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-h4lw2" podStartSLOduration=3.885917709 podStartE2EDuration="19.674996995s" podCreationTimestamp="2025-10-09 10:42:18 +0000 UTC" firstStartedPulling="2025-10-09 10:42:20.83196548 +0000 UTC m=+879.794165871" lastFinishedPulling="2025-10-09 10:42:36.621044776 +0000 UTC m=+895.583245157" observedRunningTime="2025-10-09 10:42:37.669493632 +0000 UTC m=+896.631694013" watchObservedRunningTime="2025-10-09 10:42:37.674996995 +0000 UTC m=+896.637197866" Oct 09 10:42:38 crc kubenswrapper[4740]: I1009 10:42:38.656149 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-gg5c4" event={"ID":"1a6180f0-55bd-4c7e-a96c-97762cace534","Type":"ContainerStarted","Data":"23ec8cd176eb024848a5d41b9de7f1137a410b5fb2abadbc5502240405b94e5f"} Oct 09 10:42:38 crc kubenswrapper[4740]: I1009 10:42:38.657134 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-gg5c4" Oct 09 10:42:38 crc kubenswrapper[4740]: I1009 10:42:38.674044 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-gg5c4" podStartSLOduration=2.711240168 podStartE2EDuration="19.674022778s" podCreationTimestamp="2025-10-09 10:42:19 +0000 UTC" firstStartedPulling="2025-10-09 10:42:20.831990921 +0000 UTC m=+879.794191302" lastFinishedPulling="2025-10-09 10:42:37.794773531 +0000 UTC m=+896.756973912" observedRunningTime="2025-10-09 10:42:38.669005889 +0000 UTC m=+897.631206270" watchObservedRunningTime="2025-10-09 10:42:38.674022778 +0000 UTC 
m=+897.636223159" Oct 09 10:42:39 crc kubenswrapper[4740]: I1009 10:42:39.109326 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-mp2p4" Oct 09 10:42:39 crc kubenswrapper[4740]: I1009 10:42:39.134969 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-w2ftw" Oct 09 10:42:39 crc kubenswrapper[4740]: I1009 10:42:39.228469 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-prf5f" Oct 09 10:42:39 crc kubenswrapper[4740]: I1009 10:42:39.340200 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hsl94" Oct 09 10:42:39 crc kubenswrapper[4740]: I1009 10:42:39.369550 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-59578bc799-6jtst" Oct 09 10:42:39 crc kubenswrapper[4740]: I1009 10:42:39.409439 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-cz2dz" Oct 09 10:42:39 crc kubenswrapper[4740]: I1009 10:42:39.443900 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sf7cf" Oct 09 10:42:39 crc kubenswrapper[4740]: I1009 10:42:39.458547 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-268g9" Oct 09 10:42:39 crc kubenswrapper[4740]: I1009 10:42:39.493374 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-p4btw" Oct 09 10:42:39 crc kubenswrapper[4740]: 
I1009 10:42:39.605349 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-8nkmb" Oct 09 10:42:39 crc kubenswrapper[4740]: I1009 10:42:39.697315 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-664664cb68-q5tlk" Oct 09 10:42:39 crc kubenswrapper[4740]: I1009 10:42:39.793288 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-6fng6" Oct 09 10:42:39 crc kubenswrapper[4740]: I1009 10:42:39.912154 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-4zsvx" Oct 09 10:42:39 crc kubenswrapper[4740]: I1009 10:42:39.930513 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-646675d848-ngw9n" Oct 09 10:42:41 crc kubenswrapper[4740]: I1009 10:42:41.140939 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx" Oct 09 10:42:41 crc kubenswrapper[4740]: I1009 10:42:41.675942 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-kv5jg" event={"ID":"676e4e26-21ec-4b2c-ab3f-bc593cddfb33","Type":"ContainerStarted","Data":"0f4320923db32bfe64003473be70182c3d8e102952ed1c3d14335bd8c9634c70"} Oct 09 10:42:41 crc kubenswrapper[4740]: I1009 10:42:41.677018 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-kv5jg" Oct 09 10:42:41 crc kubenswrapper[4740]: I1009 10:42:41.678694 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mxz8d" event={"ID":"d454e0e1-1745-4fc0-aea1-9d231de7fa65","Type":"ContainerStarted","Data":"cc98baa9ae12c02b6fccfe064aa1dcdf571c78a76c43e054ab476bd1e10e5d89"} Oct 09 10:42:41 crc kubenswrapper[4740]: I1009 10:42:41.679377 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mxz8d" Oct 09 10:42:41 crc kubenswrapper[4740]: I1009 10:42:41.681659 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jqnnv" event={"ID":"93f4faa8-4d5e-48d9-ac5a-bb1468f972d3","Type":"ContainerStarted","Data":"ab306f44675e9998da99d5d0da39203d7c8c90bcdd623392f509ea9400eb2ca1"} Oct 09 10:42:41 crc kubenswrapper[4740]: I1009 10:42:41.682099 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jqnnv" Oct 09 10:42:41 crc kubenswrapper[4740]: I1009 10:42:41.683862 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-r494n" event={"ID":"7e7f599f-1cc9-41fc-b683-8b0de6e48761","Type":"ContainerStarted","Data":"48edec425705314d40247fb71222a9477b89f7aaf28526a8aa7aa068ac44b263"} Oct 09 10:42:41 crc kubenswrapper[4740]: I1009 10:42:41.684018 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-r494n" Oct 09 10:42:41 crc kubenswrapper[4740]: I1009 10:42:41.719025 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jqnnv" podStartSLOduration=3.897513551 podStartE2EDuration="23.719007228s" podCreationTimestamp="2025-10-09 10:42:18 +0000 UTC" firstStartedPulling="2025-10-09 10:42:20.831932309 +0000 UTC m=+879.794132690" lastFinishedPulling="2025-10-09 
10:42:40.653425966 +0000 UTC m=+899.615626367" observedRunningTime="2025-10-09 10:42:41.715544662 +0000 UTC m=+900.677745043" watchObservedRunningTime="2025-10-09 10:42:41.719007228 +0000 UTC m=+900.681207609" Oct 09 10:42:41 crc kubenswrapper[4740]: I1009 10:42:41.721016 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-kv5jg" podStartSLOduration=3.873293038 podStartE2EDuration="23.721005393s" podCreationTimestamp="2025-10-09 10:42:18 +0000 UTC" firstStartedPulling="2025-10-09 10:42:20.816550052 +0000 UTC m=+879.778750433" lastFinishedPulling="2025-10-09 10:42:40.664262397 +0000 UTC m=+899.626462788" observedRunningTime="2025-10-09 10:42:41.697003407 +0000 UTC m=+900.659203788" watchObservedRunningTime="2025-10-09 10:42:41.721005393 +0000 UTC m=+900.683205774" Oct 09 10:42:41 crc kubenswrapper[4740]: I1009 10:42:41.736204 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mxz8d" podStartSLOduration=2.309409199 podStartE2EDuration="22.736185395s" podCreationTimestamp="2025-10-09 10:42:19 +0000 UTC" firstStartedPulling="2025-10-09 10:42:20.800176597 +0000 UTC m=+879.762376978" lastFinishedPulling="2025-10-09 10:42:41.226952793 +0000 UTC m=+900.189153174" observedRunningTime="2025-10-09 10:42:41.730448926 +0000 UTC m=+900.692649307" watchObservedRunningTime="2025-10-09 10:42:41.736185395 +0000 UTC m=+900.698385776" Oct 09 10:42:41 crc kubenswrapper[4740]: I1009 10:42:41.746656 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-r494n" podStartSLOduration=2.915322735 podStartE2EDuration="22.746640215s" podCreationTimestamp="2025-10-09 10:42:19 +0000 UTC" firstStartedPulling="2025-10-09 10:42:20.817249411 +0000 UTC m=+879.779449792" lastFinishedPulling="2025-10-09 10:42:40.648566891 +0000 UTC 
m=+899.610767272" observedRunningTime="2025-10-09 10:42:41.744382323 +0000 UTC m=+900.706582714" watchObservedRunningTime="2025-10-09 10:42:41.746640215 +0000 UTC m=+900.708840596" Oct 09 10:42:49 crc kubenswrapper[4740]: I1009 10:42:49.453629 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-h4lw2" Oct 09 10:42:49 crc kubenswrapper[4740]: I1009 10:42:49.578514 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-kv5jg" Oct 09 10:42:49 crc kubenswrapper[4740]: I1009 10:42:49.660396 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-r494n" Oct 09 10:42:49 crc kubenswrapper[4740]: I1009 10:42:49.729633 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-gg5c4" Oct 09 10:42:49 crc kubenswrapper[4740]: I1009 10:42:49.850071 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mxz8d" Oct 09 10:42:50 crc kubenswrapper[4740]: I1009 10:42:50.091499 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jqnnv" Oct 09 10:43:05 crc kubenswrapper[4740]: I1009 10:43:05.408255 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 10:43:05 crc kubenswrapper[4740]: I1009 10:43:05.408874 4740 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.532436 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-n4bxk"] Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.535831 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-n4bxk" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.542527 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.542592 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-n4bxk"] Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.542543 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.547248 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-m7g9p" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.547642 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.619614 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-szmdz"] Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.621027 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-szmdz" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.625067 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.634000 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-szmdz"] Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.644723 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2z4c\" (UniqueName: \"kubernetes.io/projected/e780ff7b-74c6-41aa-8a91-b209afe2f69c-kube-api-access-t2z4c\") pod \"dnsmasq-dns-675f4bcbfc-n4bxk\" (UID: \"e780ff7b-74c6-41aa-8a91-b209afe2f69c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-n4bxk" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.644843 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e780ff7b-74c6-41aa-8a91-b209afe2f69c-config\") pod \"dnsmasq-dns-675f4bcbfc-n4bxk\" (UID: \"e780ff7b-74c6-41aa-8a91-b209afe2f69c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-n4bxk" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.746548 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c05fcab-7edd-46c8-883c-c6dd58821780-config\") pod \"dnsmasq-dns-78dd6ddcc-szmdz\" (UID: \"9c05fcab-7edd-46c8-883c-c6dd58821780\") " pod="openstack/dnsmasq-dns-78dd6ddcc-szmdz" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.746602 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c05fcab-7edd-46c8-883c-c6dd58821780-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-szmdz\" (UID: \"9c05fcab-7edd-46c8-883c-c6dd58821780\") " pod="openstack/dnsmasq-dns-78dd6ddcc-szmdz" Oct 09 10:43:07 
crc kubenswrapper[4740]: I1009 10:43:07.746655 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9grq\" (UniqueName: \"kubernetes.io/projected/9c05fcab-7edd-46c8-883c-c6dd58821780-kube-api-access-q9grq\") pod \"dnsmasq-dns-78dd6ddcc-szmdz\" (UID: \"9c05fcab-7edd-46c8-883c-c6dd58821780\") " pod="openstack/dnsmasq-dns-78dd6ddcc-szmdz" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.746713 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2z4c\" (UniqueName: \"kubernetes.io/projected/e780ff7b-74c6-41aa-8a91-b209afe2f69c-kube-api-access-t2z4c\") pod \"dnsmasq-dns-675f4bcbfc-n4bxk\" (UID: \"e780ff7b-74c6-41aa-8a91-b209afe2f69c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-n4bxk" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.746791 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e780ff7b-74c6-41aa-8a91-b209afe2f69c-config\") pod \"dnsmasq-dns-675f4bcbfc-n4bxk\" (UID: \"e780ff7b-74c6-41aa-8a91-b209afe2f69c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-n4bxk" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.747855 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e780ff7b-74c6-41aa-8a91-b209afe2f69c-config\") pod \"dnsmasq-dns-675f4bcbfc-n4bxk\" (UID: \"e780ff7b-74c6-41aa-8a91-b209afe2f69c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-n4bxk" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.771995 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2z4c\" (UniqueName: \"kubernetes.io/projected/e780ff7b-74c6-41aa-8a91-b209afe2f69c-kube-api-access-t2z4c\") pod \"dnsmasq-dns-675f4bcbfc-n4bxk\" (UID: \"e780ff7b-74c6-41aa-8a91-b209afe2f69c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-n4bxk" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 
10:43:07.848053 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c05fcab-7edd-46c8-883c-c6dd58821780-config\") pod \"dnsmasq-dns-78dd6ddcc-szmdz\" (UID: \"9c05fcab-7edd-46c8-883c-c6dd58821780\") " pod="openstack/dnsmasq-dns-78dd6ddcc-szmdz" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.848101 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c05fcab-7edd-46c8-883c-c6dd58821780-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-szmdz\" (UID: \"9c05fcab-7edd-46c8-883c-c6dd58821780\") " pod="openstack/dnsmasq-dns-78dd6ddcc-szmdz" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.848140 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9grq\" (UniqueName: \"kubernetes.io/projected/9c05fcab-7edd-46c8-883c-c6dd58821780-kube-api-access-q9grq\") pod \"dnsmasq-dns-78dd6ddcc-szmdz\" (UID: \"9c05fcab-7edd-46c8-883c-c6dd58821780\") " pod="openstack/dnsmasq-dns-78dd6ddcc-szmdz" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.849291 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c05fcab-7edd-46c8-883c-c6dd58821780-config\") pod \"dnsmasq-dns-78dd6ddcc-szmdz\" (UID: \"9c05fcab-7edd-46c8-883c-c6dd58821780\") " pod="openstack/dnsmasq-dns-78dd6ddcc-szmdz" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.849929 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c05fcab-7edd-46c8-883c-c6dd58821780-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-szmdz\" (UID: \"9c05fcab-7edd-46c8-883c-c6dd58821780\") " pod="openstack/dnsmasq-dns-78dd6ddcc-szmdz" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.858767 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-n4bxk" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.869015 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9grq\" (UniqueName: \"kubernetes.io/projected/9c05fcab-7edd-46c8-883c-c6dd58821780-kube-api-access-q9grq\") pod \"dnsmasq-dns-78dd6ddcc-szmdz\" (UID: \"9c05fcab-7edd-46c8-883c-c6dd58821780\") " pod="openstack/dnsmasq-dns-78dd6ddcc-szmdz" Oct 09 10:43:07 crc kubenswrapper[4740]: I1009 10:43:07.977139 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-szmdz" Oct 09 10:43:08 crc kubenswrapper[4740]: I1009 10:43:08.286868 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-n4bxk"] Oct 09 10:43:08 crc kubenswrapper[4740]: I1009 10:43:08.395303 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-szmdz"] Oct 09 10:43:08 crc kubenswrapper[4740]: W1009 10:43:08.401805 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c05fcab_7edd_46c8_883c_c6dd58821780.slice/crio-6a8c87c4a02dd95f6066f18251669a92efac031bc879febeebafc25480c8d9e0 WatchSource:0}: Error finding container 6a8c87c4a02dd95f6066f18251669a92efac031bc879febeebafc25480c8d9e0: Status 404 returned error can't find the container with id 6a8c87c4a02dd95f6066f18251669a92efac031bc879febeebafc25480c8d9e0 Oct 09 10:43:08 crc kubenswrapper[4740]: I1009 10:43:08.909785 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-szmdz" event={"ID":"9c05fcab-7edd-46c8-883c-c6dd58821780","Type":"ContainerStarted","Data":"6a8c87c4a02dd95f6066f18251669a92efac031bc879febeebafc25480c8d9e0"} Oct 09 10:43:08 crc kubenswrapper[4740]: I1009 10:43:08.910880 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-n4bxk" 
event={"ID":"e780ff7b-74c6-41aa-8a91-b209afe2f69c","Type":"ContainerStarted","Data":"974007d6d99cc7637249dfe12dafe26fb77043c17435b3fa77594d10557f404c"} Oct 09 10:43:10 crc kubenswrapper[4740]: I1009 10:43:10.756872 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-n4bxk"] Oct 09 10:43:10 crc kubenswrapper[4740]: I1009 10:43:10.785781 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tv5zx"] Oct 09 10:43:10 crc kubenswrapper[4740]: I1009 10:43:10.787204 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" Oct 09 10:43:10 crc kubenswrapper[4740]: I1009 10:43:10.802432 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tv5zx"] Oct 09 10:43:10 crc kubenswrapper[4740]: I1009 10:43:10.893522 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7517706-a284-4265-8deb-cbb523873afd-dns-svc\") pod \"dnsmasq-dns-666b6646f7-tv5zx\" (UID: \"b7517706-a284-4265-8deb-cbb523873afd\") " pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" Oct 09 10:43:10 crc kubenswrapper[4740]: I1009 10:43:10.893647 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htnlc\" (UniqueName: \"kubernetes.io/projected/b7517706-a284-4265-8deb-cbb523873afd-kube-api-access-htnlc\") pod \"dnsmasq-dns-666b6646f7-tv5zx\" (UID: \"b7517706-a284-4265-8deb-cbb523873afd\") " pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" Oct 09 10:43:10 crc kubenswrapper[4740]: I1009 10:43:10.893708 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7517706-a284-4265-8deb-cbb523873afd-config\") pod \"dnsmasq-dns-666b6646f7-tv5zx\" (UID: \"b7517706-a284-4265-8deb-cbb523873afd\") " 
pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" Oct 09 10:43:10 crc kubenswrapper[4740]: I1009 10:43:10.998024 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htnlc\" (UniqueName: \"kubernetes.io/projected/b7517706-a284-4265-8deb-cbb523873afd-kube-api-access-htnlc\") pod \"dnsmasq-dns-666b6646f7-tv5zx\" (UID: \"b7517706-a284-4265-8deb-cbb523873afd\") " pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" Oct 09 10:43:10 crc kubenswrapper[4740]: I1009 10:43:10.998110 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7517706-a284-4265-8deb-cbb523873afd-config\") pod \"dnsmasq-dns-666b6646f7-tv5zx\" (UID: \"b7517706-a284-4265-8deb-cbb523873afd\") " pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" Oct 09 10:43:10 crc kubenswrapper[4740]: I1009 10:43:10.998157 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7517706-a284-4265-8deb-cbb523873afd-dns-svc\") pod \"dnsmasq-dns-666b6646f7-tv5zx\" (UID: \"b7517706-a284-4265-8deb-cbb523873afd\") " pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" Oct 09 10:43:10 crc kubenswrapper[4740]: I1009 10:43:10.999306 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7517706-a284-4265-8deb-cbb523873afd-dns-svc\") pod \"dnsmasq-dns-666b6646f7-tv5zx\" (UID: \"b7517706-a284-4265-8deb-cbb523873afd\") " pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.000304 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7517706-a284-4265-8deb-cbb523873afd-config\") pod \"dnsmasq-dns-666b6646f7-tv5zx\" (UID: \"b7517706-a284-4265-8deb-cbb523873afd\") " pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.036879 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htnlc\" (UniqueName: \"kubernetes.io/projected/b7517706-a284-4265-8deb-cbb523873afd-kube-api-access-htnlc\") pod \"dnsmasq-dns-666b6646f7-tv5zx\" (UID: \"b7517706-a284-4265-8deb-cbb523873afd\") " pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.066648 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-szmdz"] Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.093543 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xdk5b"] Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.094688 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.114151 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.152819 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xdk5b"] Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.202356 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18483b6e-9941-4d1c-af6e-2812758e0265-config\") pod \"dnsmasq-dns-57d769cc4f-xdk5b\" (UID: \"18483b6e-9941-4d1c-af6e-2812758e0265\") " pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.202552 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xdrv\" (UniqueName: \"kubernetes.io/projected/18483b6e-9941-4d1c-af6e-2812758e0265-kube-api-access-6xdrv\") pod \"dnsmasq-dns-57d769cc4f-xdk5b\" (UID: \"18483b6e-9941-4d1c-af6e-2812758e0265\") " pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" 
Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.202673 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18483b6e-9941-4d1c-af6e-2812758e0265-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-xdk5b\" (UID: \"18483b6e-9941-4d1c-af6e-2812758e0265\") " pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.304094 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xdrv\" (UniqueName: \"kubernetes.io/projected/18483b6e-9941-4d1c-af6e-2812758e0265-kube-api-access-6xdrv\") pod \"dnsmasq-dns-57d769cc4f-xdk5b\" (UID: \"18483b6e-9941-4d1c-af6e-2812758e0265\") " pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.304487 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18483b6e-9941-4d1c-af6e-2812758e0265-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-xdk5b\" (UID: \"18483b6e-9941-4d1c-af6e-2812758e0265\") " pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.304563 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18483b6e-9941-4d1c-af6e-2812758e0265-config\") pod \"dnsmasq-dns-57d769cc4f-xdk5b\" (UID: \"18483b6e-9941-4d1c-af6e-2812758e0265\") " pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.305587 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18483b6e-9941-4d1c-af6e-2812758e0265-config\") pod \"dnsmasq-dns-57d769cc4f-xdk5b\" (UID: \"18483b6e-9941-4d1c-af6e-2812758e0265\") " pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.305676 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18483b6e-9941-4d1c-af6e-2812758e0265-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-xdk5b\" (UID: \"18483b6e-9941-4d1c-af6e-2812758e0265\") " pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.320775 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xdrv\" (UniqueName: \"kubernetes.io/projected/18483b6e-9941-4d1c-af6e-2812758e0265-kube-api-access-6xdrv\") pod \"dnsmasq-dns-57d769cc4f-xdk5b\" (UID: \"18483b6e-9941-4d1c-af6e-2812758e0265\") " pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.408010 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.936932 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.938344 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.941731 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.941995 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-k7bn4" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.943636 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.943677 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.943854 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.943899 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.948281 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 09 10:43:11 crc kubenswrapper[4740]: I1009 10:43:11.949580 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.014976 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx5x7\" (UniqueName: \"kubernetes.io/projected/aa98dfc6-da2e-42b0-a620-a07230e1833d-kube-api-access-wx5x7\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.015034 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.015063 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aa98dfc6-da2e-42b0-a620-a07230e1833d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.015092 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa98dfc6-da2e-42b0-a620-a07230e1833d-config-data\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.015181 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aa98dfc6-da2e-42b0-a620-a07230e1833d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.015218 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.015243 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: 
\"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.015271 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.015299 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aa98dfc6-da2e-42b0-a620-a07230e1833d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.015318 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.015418 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aa98dfc6-da2e-42b0-a620-a07230e1833d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.116966 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aa98dfc6-da2e-42b0-a620-a07230e1833d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " 
pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.117057 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx5x7\" (UniqueName: \"kubernetes.io/projected/aa98dfc6-da2e-42b0-a620-a07230e1833d-kube-api-access-wx5x7\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.117091 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.117113 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aa98dfc6-da2e-42b0-a620-a07230e1833d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.117208 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa98dfc6-da2e-42b0-a620-a07230e1833d-config-data\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.117465 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aa98dfc6-da2e-42b0-a620-a07230e1833d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.117523 4740 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.117557 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.117598 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.117633 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aa98dfc6-da2e-42b0-a620-a07230e1833d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.117662 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.117964 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aa98dfc6-da2e-42b0-a620-a07230e1833d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.117986 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.118239 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.119885 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.119790 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa98dfc6-da2e-42b0-a620-a07230e1833d-config-data\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.123741 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.123981 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aa98dfc6-da2e-42b0-a620-a07230e1833d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.130924 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aa98dfc6-da2e-42b0-a620-a07230e1833d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.131193 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aa98dfc6-da2e-42b0-a620-a07230e1833d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.131636 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.141283 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx5x7\" (UniqueName: \"kubernetes.io/projected/aa98dfc6-da2e-42b0-a620-a07230e1833d-kube-api-access-wx5x7\") pod \"rabbitmq-server-0\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.144182 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: 
\"aa98dfc6-da2e-42b0-a620-a07230e1833d\") " pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.232205 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.233591 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.235434 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.235688 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.236665 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.236999 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.237259 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.237365 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.238154 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-t9rb9" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.244703 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.271433 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.325010 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/187134d2-2fe9-4beb-beff-6a48162a1933-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.325055 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/187134d2-2fe9-4beb-beff-6a48162a1933-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.325098 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.325124 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/187134d2-2fe9-4beb-beff-6a48162a1933-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.325148 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.325176 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.325205 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db275\" (UniqueName: \"kubernetes.io/projected/187134d2-2fe9-4beb-beff-6a48162a1933-kube-api-access-db275\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.325238 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/187134d2-2fe9-4beb-beff-6a48162a1933-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.325257 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.325290 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/187134d2-2fe9-4beb-beff-6a48162a1933-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") 
" pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.325384 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.426866 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/187134d2-2fe9-4beb-beff-6a48162a1933-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.426920 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.426967 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/187134d2-2fe9-4beb-beff-6a48162a1933-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.426986 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/187134d2-2fe9-4beb-beff-6a48162a1933-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 
10:43:12.427026 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.427049 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/187134d2-2fe9-4beb-beff-6a48162a1933-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.427071 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.427101 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.427133 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db275\" (UniqueName: \"kubernetes.io/projected/187134d2-2fe9-4beb-beff-6a48162a1933-kube-api-access-db275\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.427207 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/187134d2-2fe9-4beb-beff-6a48162a1933-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.427231 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.427644 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/187134d2-2fe9-4beb-beff-6a48162a1933-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.428317 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.428356 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.428498 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.428992 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/187134d2-2fe9-4beb-beff-6a48162a1933-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.432147 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.432337 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/187134d2-2fe9-4beb-beff-6a48162a1933-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.432489 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/187134d2-2fe9-4beb-beff-6a48162a1933-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.433391 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc 
kubenswrapper[4740]: I1009 10:43:12.433981 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/187134d2-2fe9-4beb-beff-6a48162a1933-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.446672 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db275\" (UniqueName: \"kubernetes.io/projected/187134d2-2fe9-4beb-beff-6a48162a1933-kube-api-access-db275\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.449098 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:12 crc kubenswrapper[4740]: I1009 10:43:12.559581 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.074219 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.076550 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.083041 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.083275 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-jx4z9" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.083284 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.083399 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.083424 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.085797 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.088869 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.173717 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3dce8908-af4b-4596-bed2-02788a615207-kolla-config\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.173809 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwkhd\" (UniqueName: \"kubernetes.io/projected/3dce8908-af4b-4596-bed2-02788a615207-kube-api-access-lwkhd\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" 
Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.173838 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3dce8908-af4b-4596-bed2-02788a615207-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.173883 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dce8908-af4b-4596-bed2-02788a615207-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.174121 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3dce8908-af4b-4596-bed2-02788a615207-config-data-default\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.174197 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dce8908-af4b-4596-bed2-02788a615207-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.174248 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3dce8908-af4b-4596-bed2-02788a615207-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc 
kubenswrapper[4740]: I1009 10:43:15.174281 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/3dce8908-af4b-4596-bed2-02788a615207-secrets\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.174309 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.199971 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.201385 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.203835 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.204283 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.204383 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-zzggj" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.205226 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.216773 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.275500 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3dce8908-af4b-4596-bed2-02788a615207-config-data-default\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.277721 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dce8908-af4b-4596-bed2-02788a615207-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.277782 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3dce8908-af4b-4596-bed2-02788a615207-config-data-default\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.277799 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd9d52b-8167-47f9-8c36-b75f88119ad5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.277873 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dcd9d52b-8167-47f9-8c36-b75f88119ad5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.277921 4740 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3dce8908-af4b-4596-bed2-02788a615207-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.277965 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd9d52b-8167-47f9-8c36-b75f88119ad5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.277993 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/3dce8908-af4b-4596-bed2-02788a615207-secrets\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.278045 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.278088 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhlqg\" (UniqueName: \"kubernetes.io/projected/dcd9d52b-8167-47f9-8c36-b75f88119ad5-kube-api-access-bhlqg\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.278128 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/3dce8908-af4b-4596-bed2-02788a615207-kolla-config\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.278148 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcd9d52b-8167-47f9-8c36-b75f88119ad5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.278188 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d52b-8167-47f9-8c36-b75f88119ad5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.278206 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.278222 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/dcd9d52b-8167-47f9-8c36-b75f88119ad5-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.278257 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwkhd\" (UniqueName: 
\"kubernetes.io/projected/3dce8908-af4b-4596-bed2-02788a615207-kube-api-access-lwkhd\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.278276 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3dce8908-af4b-4596-bed2-02788a615207-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.278307 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dcd9d52b-8167-47f9-8c36-b75f88119ad5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.278335 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dce8908-af4b-4596-bed2-02788a615207-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.279115 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3dce8908-af4b-4596-bed2-02788a615207-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.279220 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: 
\"3dce8908-af4b-4596-bed2-02788a615207\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.280037 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3dce8908-af4b-4596-bed2-02788a615207-kolla-config\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.281849 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3dce8908-af4b-4596-bed2-02788a615207-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.284409 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/3dce8908-af4b-4596-bed2-02788a615207-secrets\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.285273 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dce8908-af4b-4596-bed2-02788a615207-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.299337 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dce8908-af4b-4596-bed2-02788a615207-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.301647 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.304467 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwkhd\" (UniqueName: \"kubernetes.io/projected/3dce8908-af4b-4596-bed2-02788a615207-kube-api-access-lwkhd\") pod \"openstack-galera-0\" (UID: \"3dce8908-af4b-4596-bed2-02788a615207\") " pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.379322 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd9d52b-8167-47f9-8c36-b75f88119ad5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.379370 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dcd9d52b-8167-47f9-8c36-b75f88119ad5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.379408 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd9d52b-8167-47f9-8c36-b75f88119ad5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.379454 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhlqg\" (UniqueName: 
\"kubernetes.io/projected/dcd9d52b-8167-47f9-8c36-b75f88119ad5-kube-api-access-bhlqg\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.379484 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcd9d52b-8167-47f9-8c36-b75f88119ad5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.379516 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d52b-8167-47f9-8c36-b75f88119ad5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.379540 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.379561 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/dcd9d52b-8167-47f9-8c36-b75f88119ad5-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.379602 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dcd9d52b-8167-47f9-8c36-b75f88119ad5-kolla-config\") pod \"openstack-cell1-galera-0\" 
(UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.380314 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.380453 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d52b-8167-47f9-8c36-b75f88119ad5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.380487 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dcd9d52b-8167-47f9-8c36-b75f88119ad5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.380877 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dcd9d52b-8167-47f9-8c36-b75f88119ad5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.381185 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd9d52b-8167-47f9-8c36-b75f88119ad5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " 
pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.383288 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcd9d52b-8167-47f9-8c36-b75f88119ad5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.383955 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd9d52b-8167-47f9-8c36-b75f88119ad5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.384812 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/dcd9d52b-8167-47f9-8c36-b75f88119ad5-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.396776 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhlqg\" (UniqueName: \"kubernetes.io/projected/dcd9d52b-8167-47f9-8c36-b75f88119ad5-kube-api-access-bhlqg\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.402336 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.436889 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"dcd9d52b-8167-47f9-8c36-b75f88119ad5\") " pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.522009 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.615078 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.616061 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.619987 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.620495 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-n69xf" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.620598 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.627619 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.683884 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f16fec59-73b1-4b57-ab47-c1767c6c2a7d-config-data\") pod \"memcached-0\" (UID: \"f16fec59-73b1-4b57-ab47-c1767c6c2a7d\") " pod="openstack/memcached-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.683932 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f16fec59-73b1-4b57-ab47-c1767c6c2a7d-kolla-config\") pod \"memcached-0\" (UID: \"f16fec59-73b1-4b57-ab47-c1767c6c2a7d\") " pod="openstack/memcached-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.683968 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16fec59-73b1-4b57-ab47-c1767c6c2a7d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f16fec59-73b1-4b57-ab47-c1767c6c2a7d\") " pod="openstack/memcached-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.684013 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxctk\" (UniqueName: \"kubernetes.io/projected/f16fec59-73b1-4b57-ab47-c1767c6c2a7d-kube-api-access-sxctk\") pod \"memcached-0\" (UID: \"f16fec59-73b1-4b57-ab47-c1767c6c2a7d\") " pod="openstack/memcached-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.684045 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f16fec59-73b1-4b57-ab47-c1767c6c2a7d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f16fec59-73b1-4b57-ab47-c1767c6c2a7d\") " pod="openstack/memcached-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.786109 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16fec59-73b1-4b57-ab47-c1767c6c2a7d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f16fec59-73b1-4b57-ab47-c1767c6c2a7d\") " pod="openstack/memcached-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.786191 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxctk\" (UniqueName: 
\"kubernetes.io/projected/f16fec59-73b1-4b57-ab47-c1767c6c2a7d-kube-api-access-sxctk\") pod \"memcached-0\" (UID: \"f16fec59-73b1-4b57-ab47-c1767c6c2a7d\") " pod="openstack/memcached-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.786225 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f16fec59-73b1-4b57-ab47-c1767c6c2a7d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f16fec59-73b1-4b57-ab47-c1767c6c2a7d\") " pod="openstack/memcached-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.786273 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f16fec59-73b1-4b57-ab47-c1767c6c2a7d-config-data\") pod \"memcached-0\" (UID: \"f16fec59-73b1-4b57-ab47-c1767c6c2a7d\") " pod="openstack/memcached-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.786297 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f16fec59-73b1-4b57-ab47-c1767c6c2a7d-kolla-config\") pod \"memcached-0\" (UID: \"f16fec59-73b1-4b57-ab47-c1767c6c2a7d\") " pod="openstack/memcached-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.790087 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f16fec59-73b1-4b57-ab47-c1767c6c2a7d-config-data\") pod \"memcached-0\" (UID: \"f16fec59-73b1-4b57-ab47-c1767c6c2a7d\") " pod="openstack/memcached-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.790241 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f16fec59-73b1-4b57-ab47-c1767c6c2a7d-kolla-config\") pod \"memcached-0\" (UID: \"f16fec59-73b1-4b57-ab47-c1767c6c2a7d\") " pod="openstack/memcached-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.790406 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f16fec59-73b1-4b57-ab47-c1767c6c2a7d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f16fec59-73b1-4b57-ab47-c1767c6c2a7d\") " pod="openstack/memcached-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.792303 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16fec59-73b1-4b57-ab47-c1767c6c2a7d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f16fec59-73b1-4b57-ab47-c1767c6c2a7d\") " pod="openstack/memcached-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.803140 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxctk\" (UniqueName: \"kubernetes.io/projected/f16fec59-73b1-4b57-ab47-c1767c6c2a7d-kube-api-access-sxctk\") pod \"memcached-0\" (UID: \"f16fec59-73b1-4b57-ab47-c1767c6c2a7d\") " pod="openstack/memcached-0" Oct 09 10:43:15 crc kubenswrapper[4740]: I1009 10:43:15.932426 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 09 10:43:17 crc kubenswrapper[4740]: I1009 10:43:17.795999 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 10:43:17 crc kubenswrapper[4740]: I1009 10:43:17.797636 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 10:43:17 crc kubenswrapper[4740]: I1009 10:43:17.799531 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-9ghpl" Oct 09 10:43:17 crc kubenswrapper[4740]: I1009 10:43:17.811032 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 10:43:17 crc kubenswrapper[4740]: I1009 10:43:17.920538 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97zmg\" (UniqueName: \"kubernetes.io/projected/97026040-37a7-4aa6-aad1-a9b204d4d329-kube-api-access-97zmg\") pod \"kube-state-metrics-0\" (UID: \"97026040-37a7-4aa6-aad1-a9b204d4d329\") " pod="openstack/kube-state-metrics-0" Oct 09 10:43:18 crc kubenswrapper[4740]: I1009 10:43:18.023487 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97zmg\" (UniqueName: \"kubernetes.io/projected/97026040-37a7-4aa6-aad1-a9b204d4d329-kube-api-access-97zmg\") pod \"kube-state-metrics-0\" (UID: \"97026040-37a7-4aa6-aad1-a9b204d4d329\") " pod="openstack/kube-state-metrics-0" Oct 09 10:43:18 crc kubenswrapper[4740]: I1009 10:43:18.054726 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97zmg\" (UniqueName: \"kubernetes.io/projected/97026040-37a7-4aa6-aad1-a9b204d4d329-kube-api-access-97zmg\") pod \"kube-state-metrics-0\" (UID: \"97026040-37a7-4aa6-aad1-a9b204d4d329\") " pod="openstack/kube-state-metrics-0" Oct 09 10:43:18 crc kubenswrapper[4740]: I1009 10:43:18.123445 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 10:43:20 crc kubenswrapper[4740]: I1009 10:43:20.942798 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-c6rld"] Oct 09 10:43:20 crc kubenswrapper[4740]: I1009 10:43:20.946542 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c6rld" Oct 09 10:43:20 crc kubenswrapper[4740]: I1009 10:43:20.948418 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-8dl7z" Oct 09 10:43:20 crc kubenswrapper[4740]: I1009 10:43:20.948727 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 09 10:43:20 crc kubenswrapper[4740]: I1009 10:43:20.948969 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 09 10:43:20 crc kubenswrapper[4740]: I1009 10:43:20.959286 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c6rld"] Oct 09 10:43:20 crc kubenswrapper[4740]: I1009 10:43:20.975802 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-cwdss"] Oct 09 10:43:20 crc kubenswrapper[4740]: I1009 10:43:20.978086 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:20 crc kubenswrapper[4740]: I1009 10:43:20.983994 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cwdss"] Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.074240 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7f56ff38-de3a-4c48-8fc0-43e0eac26c55-var-log-ovn\") pod \"ovn-controller-c6rld\" (UID: \"7f56ff38-de3a-4c48-8fc0-43e0eac26c55\") " pod="openstack/ovn-controller-c6rld" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.074287 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5a1841e0-a15d-4dca-a1a4-6b50f338ddbc-etc-ovs\") pod \"ovn-controller-ovs-cwdss\" (UID: \"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc\") " pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.074339 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2pvm\" (UniqueName: \"kubernetes.io/projected/7f56ff38-de3a-4c48-8fc0-43e0eac26c55-kube-api-access-j2pvm\") pod \"ovn-controller-c6rld\" (UID: \"7f56ff38-de3a-4c48-8fc0-43e0eac26c55\") " pod="openstack/ovn-controller-c6rld" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.074366 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f56ff38-de3a-4c48-8fc0-43e0eac26c55-scripts\") pod \"ovn-controller-c6rld\" (UID: \"7f56ff38-de3a-4c48-8fc0-43e0eac26c55\") " pod="openstack/ovn-controller-c6rld" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.074384 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/7f56ff38-de3a-4c48-8fc0-43e0eac26c55-var-run-ovn\") pod \"ovn-controller-c6rld\" (UID: \"7f56ff38-de3a-4c48-8fc0-43e0eac26c55\") " pod="openstack/ovn-controller-c6rld" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.074407 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5a1841e0-a15d-4dca-a1a4-6b50f338ddbc-var-run\") pod \"ovn-controller-ovs-cwdss\" (UID: \"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc\") " pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.074428 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a1841e0-a15d-4dca-a1a4-6b50f338ddbc-scripts\") pod \"ovn-controller-ovs-cwdss\" (UID: \"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc\") " pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.074456 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5a1841e0-a15d-4dca-a1a4-6b50f338ddbc-var-log\") pod \"ovn-controller-ovs-cwdss\" (UID: \"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc\") " pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.074476 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5ts4\" (UniqueName: \"kubernetes.io/projected/5a1841e0-a15d-4dca-a1a4-6b50f338ddbc-kube-api-access-f5ts4\") pod \"ovn-controller-ovs-cwdss\" (UID: \"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc\") " pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.074492 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7f56ff38-de3a-4c48-8fc0-43e0eac26c55-ovn-controller-tls-certs\") pod \"ovn-controller-c6rld\" (UID: \"7f56ff38-de3a-4c48-8fc0-43e0eac26c55\") " pod="openstack/ovn-controller-c6rld" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.074516 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f56ff38-de3a-4c48-8fc0-43e0eac26c55-combined-ca-bundle\") pod \"ovn-controller-c6rld\" (UID: \"7f56ff38-de3a-4c48-8fc0-43e0eac26c55\") " pod="openstack/ovn-controller-c6rld" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.074538 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f56ff38-de3a-4c48-8fc0-43e0eac26c55-var-run\") pod \"ovn-controller-c6rld\" (UID: \"7f56ff38-de3a-4c48-8fc0-43e0eac26c55\") " pod="openstack/ovn-controller-c6rld" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.074553 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5a1841e0-a15d-4dca-a1a4-6b50f338ddbc-var-lib\") pod \"ovn-controller-ovs-cwdss\" (UID: \"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc\") " pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.175764 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f56ff38-de3a-4c48-8fc0-43e0eac26c55-scripts\") pod \"ovn-controller-c6rld\" (UID: \"7f56ff38-de3a-4c48-8fc0-43e0eac26c55\") " pod="openstack/ovn-controller-c6rld" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.175807 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f56ff38-de3a-4c48-8fc0-43e0eac26c55-var-run-ovn\") pod 
\"ovn-controller-c6rld\" (UID: \"7f56ff38-de3a-4c48-8fc0-43e0eac26c55\") " pod="openstack/ovn-controller-c6rld" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.175834 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5a1841e0-a15d-4dca-a1a4-6b50f338ddbc-var-run\") pod \"ovn-controller-ovs-cwdss\" (UID: \"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc\") " pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.175855 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a1841e0-a15d-4dca-a1a4-6b50f338ddbc-scripts\") pod \"ovn-controller-ovs-cwdss\" (UID: \"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc\") " pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.175874 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5a1841e0-a15d-4dca-a1a4-6b50f338ddbc-var-log\") pod \"ovn-controller-ovs-cwdss\" (UID: \"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc\") " pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.175896 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5ts4\" (UniqueName: \"kubernetes.io/projected/5a1841e0-a15d-4dca-a1a4-6b50f338ddbc-kube-api-access-f5ts4\") pod \"ovn-controller-ovs-cwdss\" (UID: \"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc\") " pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.175916 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f56ff38-de3a-4c48-8fc0-43e0eac26c55-ovn-controller-tls-certs\") pod \"ovn-controller-c6rld\" (UID: \"7f56ff38-de3a-4c48-8fc0-43e0eac26c55\") " pod="openstack/ovn-controller-c6rld" 
Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.175941 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f56ff38-de3a-4c48-8fc0-43e0eac26c55-combined-ca-bundle\") pod \"ovn-controller-c6rld\" (UID: \"7f56ff38-de3a-4c48-8fc0-43e0eac26c55\") " pod="openstack/ovn-controller-c6rld" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.175964 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5a1841e0-a15d-4dca-a1a4-6b50f338ddbc-var-lib\") pod \"ovn-controller-ovs-cwdss\" (UID: \"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc\") " pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.175980 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f56ff38-de3a-4c48-8fc0-43e0eac26c55-var-run\") pod \"ovn-controller-c6rld\" (UID: \"7f56ff38-de3a-4c48-8fc0-43e0eac26c55\") " pod="openstack/ovn-controller-c6rld" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.176003 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7f56ff38-de3a-4c48-8fc0-43e0eac26c55-var-log-ovn\") pod \"ovn-controller-c6rld\" (UID: \"7f56ff38-de3a-4c48-8fc0-43e0eac26c55\") " pod="openstack/ovn-controller-c6rld" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.176028 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5a1841e0-a15d-4dca-a1a4-6b50f338ddbc-etc-ovs\") pod \"ovn-controller-ovs-cwdss\" (UID: \"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc\") " pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.176070 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-j2pvm\" (UniqueName: \"kubernetes.io/projected/7f56ff38-de3a-4c48-8fc0-43e0eac26c55-kube-api-access-j2pvm\") pod \"ovn-controller-c6rld\" (UID: \"7f56ff38-de3a-4c48-8fc0-43e0eac26c55\") " pod="openstack/ovn-controller-c6rld" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.177619 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f56ff38-de3a-4c48-8fc0-43e0eac26c55-var-run\") pod \"ovn-controller-c6rld\" (UID: \"7f56ff38-de3a-4c48-8fc0-43e0eac26c55\") " pod="openstack/ovn-controller-c6rld" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.177718 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7f56ff38-de3a-4c48-8fc0-43e0eac26c55-var-log-ovn\") pod \"ovn-controller-c6rld\" (UID: \"7f56ff38-de3a-4c48-8fc0-43e0eac26c55\") " pod="openstack/ovn-controller-c6rld" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.177848 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5a1841e0-a15d-4dca-a1a4-6b50f338ddbc-etc-ovs\") pod \"ovn-controller-ovs-cwdss\" (UID: \"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc\") " pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.177980 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f56ff38-de3a-4c48-8fc0-43e0eac26c55-var-run-ovn\") pod \"ovn-controller-c6rld\" (UID: \"7f56ff38-de3a-4c48-8fc0-43e0eac26c55\") " pod="openstack/ovn-controller-c6rld" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.178047 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5a1841e0-a15d-4dca-a1a4-6b50f338ddbc-var-run\") pod \"ovn-controller-ovs-cwdss\" (UID: \"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc\") " 
pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.178214 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5a1841e0-a15d-4dca-a1a4-6b50f338ddbc-var-lib\") pod \"ovn-controller-ovs-cwdss\" (UID: \"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc\") " pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.178373 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5a1841e0-a15d-4dca-a1a4-6b50f338ddbc-var-log\") pod \"ovn-controller-ovs-cwdss\" (UID: \"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc\") " pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.179377 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f56ff38-de3a-4c48-8fc0-43e0eac26c55-scripts\") pod \"ovn-controller-c6rld\" (UID: \"7f56ff38-de3a-4c48-8fc0-43e0eac26c55\") " pod="openstack/ovn-controller-c6rld" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.180121 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a1841e0-a15d-4dca-a1a4-6b50f338ddbc-scripts\") pod \"ovn-controller-ovs-cwdss\" (UID: \"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc\") " pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.193595 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f56ff38-de3a-4c48-8fc0-43e0eac26c55-ovn-controller-tls-certs\") pod \"ovn-controller-c6rld\" (UID: \"7f56ff38-de3a-4c48-8fc0-43e0eac26c55\") " pod="openstack/ovn-controller-c6rld" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.193652 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f56ff38-de3a-4c48-8fc0-43e0eac26c55-combined-ca-bundle\") pod \"ovn-controller-c6rld\" (UID: \"7f56ff38-de3a-4c48-8fc0-43e0eac26c55\") " pod="openstack/ovn-controller-c6rld" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.195336 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5ts4\" (UniqueName: \"kubernetes.io/projected/5a1841e0-a15d-4dca-a1a4-6b50f338ddbc-kube-api-access-f5ts4\") pod \"ovn-controller-ovs-cwdss\" (UID: \"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc\") " pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.196502 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2pvm\" (UniqueName: \"kubernetes.io/projected/7f56ff38-de3a-4c48-8fc0-43e0eac26c55-kube-api-access-j2pvm\") pod \"ovn-controller-c6rld\" (UID: \"7f56ff38-de3a-4c48-8fc0-43e0eac26c55\") " pod="openstack/ovn-controller-c6rld" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.277146 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c6rld" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.294559 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.540343 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.541559 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.543650 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.543891 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-dnwvm" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.544027 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.544240 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.546779 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.557529 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.582724 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c37175d-6801-461a-82ba-ea611afdaebf-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.582830 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0c37175d-6801-461a-82ba-ea611afdaebf-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.582862 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-5v8bp\" (UniqueName: \"kubernetes.io/projected/0c37175d-6801-461a-82ba-ea611afdaebf-kube-api-access-5v8bp\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.582916 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c37175d-6801-461a-82ba-ea611afdaebf-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.583060 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c37175d-6801-461a-82ba-ea611afdaebf-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.583087 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c37175d-6801-461a-82ba-ea611afdaebf-config\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.583120 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c37175d-6801-461a-82ba-ea611afdaebf-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.583165 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.684603 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c37175d-6801-461a-82ba-ea611afdaebf-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.684707 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c37175d-6801-461a-82ba-ea611afdaebf-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.684728 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c37175d-6801-461a-82ba-ea611afdaebf-config\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.684777 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c37175d-6801-461a-82ba-ea611afdaebf-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.684822 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc 
kubenswrapper[4740]: I1009 10:43:21.684847 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c37175d-6801-461a-82ba-ea611afdaebf-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.684866 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0c37175d-6801-461a-82ba-ea611afdaebf-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.684890 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v8bp\" (UniqueName: \"kubernetes.io/projected/0c37175d-6801-461a-82ba-ea611afdaebf-kube-api-access-5v8bp\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.686289 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c37175d-6801-461a-82ba-ea611afdaebf-config\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.686665 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c37175d-6801-461a-82ba-ea611afdaebf-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.687291 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/0c37175d-6801-461a-82ba-ea611afdaebf-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.687641 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.692197 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c37175d-6801-461a-82ba-ea611afdaebf-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.692725 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c37175d-6801-461a-82ba-ea611afdaebf-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.693252 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c37175d-6801-461a-82ba-ea611afdaebf-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.717193 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v8bp\" (UniqueName: \"kubernetes.io/projected/0c37175d-6801-461a-82ba-ea611afdaebf-kube-api-access-5v8bp\") pod \"ovsdbserver-nb-0\" (UID: 
\"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.728615 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0c37175d-6801-461a-82ba-ea611afdaebf\") " pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:21 crc kubenswrapper[4740]: I1009 10:43:21.888972 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:22 crc kubenswrapper[4740]: E1009 10:43:22.476106 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 09 10:43:22 crc kubenswrapper[4740]: E1009 10:43:22.476270 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q9grq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-szmdz_openstack(9c05fcab-7edd-46c8-883c-c6dd58821780): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 10:43:22 crc kubenswrapper[4740]: E1009 10:43:22.477625 4740 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-szmdz" podUID="9c05fcab-7edd-46c8-883c-c6dd58821780" Oct 09 10:43:22 crc kubenswrapper[4740]: E1009 10:43:22.519620 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 09 10:43:22 crc kubenswrapper[4740]: E1009 10:43:22.520144 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2z4c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-n4bxk_openstack(e780ff7b-74c6-41aa-8a91-b209afe2f69c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 10:43:22 crc kubenswrapper[4740]: E1009 10:43:22.522227 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-n4bxk" podUID="e780ff7b-74c6-41aa-8a91-b209afe2f69c" Oct 09 10:43:22 crc kubenswrapper[4740]: I1009 10:43:22.999241 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.412378 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xdk5b"] Oct 09 10:43:23 crc kubenswrapper[4740]: W1009 10:43:23.418070 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18483b6e_9941_4d1c_af6e_2812758e0265.slice/crio-8609c5b0f024f710a56eab9e6b6ac024a38facdb11427f3932e5761144cc8a7b WatchSource:0}: Error finding container 8609c5b0f024f710a56eab9e6b6ac024a38facdb11427f3932e5761144cc8a7b: Status 404 returned error can't find the container with id 8609c5b0f024f710a56eab9e6b6ac024a38facdb11427f3932e5761144cc8a7b Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.418740 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 10:43:23 crc kubenswrapper[4740]: W1009 10:43:23.420181 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7517706_a284_4265_8deb_cbb523873afd.slice/crio-8dc5a21b1e3e158b7a084265b6a4ad1e92e345031394bcd1e912aef6e93ded85 WatchSource:0}: Error finding container 8dc5a21b1e3e158b7a084265b6a4ad1e92e345031394bcd1e912aef6e93ded85: Status 404 returned error can't find the container with id 8dc5a21b1e3e158b7a084265b6a4ad1e92e345031394bcd1e912aef6e93ded85 Oct 09 10:43:23 crc kubenswrapper[4740]: W1009 10:43:23.421943 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dce8908_af4b_4596_bed2_02788a615207.slice/crio-c5c86e5371413ca0b8fc8e4f7bf1402e803bbe94faa25bb783d7076a9ee9eb04 WatchSource:0}: Error finding container 
c5c86e5371413ca0b8fc8e4f7bf1402e803bbe94faa25bb783d7076a9ee9eb04: Status 404 returned error can't find the container with id c5c86e5371413ca0b8fc8e4f7bf1402e803bbe94faa25bb783d7076a9ee9eb04 Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.427521 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-szmdz" Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.432449 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tv5zx"] Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.432452 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-n4bxk" Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.442012 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.515459 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c05fcab-7edd-46c8-883c-c6dd58821780-dns-svc\") pod \"9c05fcab-7edd-46c8-883c-c6dd58821780\" (UID: \"9c05fcab-7edd-46c8-883c-c6dd58821780\") " Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.515502 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2z4c\" (UniqueName: \"kubernetes.io/projected/e780ff7b-74c6-41aa-8a91-b209afe2f69c-kube-api-access-t2z4c\") pod \"e780ff7b-74c6-41aa-8a91-b209afe2f69c\" (UID: \"e780ff7b-74c6-41aa-8a91-b209afe2f69c\") " Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.515528 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e780ff7b-74c6-41aa-8a91-b209afe2f69c-config\") pod \"e780ff7b-74c6-41aa-8a91-b209afe2f69c\" (UID: \"e780ff7b-74c6-41aa-8a91-b209afe2f69c\") " Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.515555 
4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9grq\" (UniqueName: \"kubernetes.io/projected/9c05fcab-7edd-46c8-883c-c6dd58821780-kube-api-access-q9grq\") pod \"9c05fcab-7edd-46c8-883c-c6dd58821780\" (UID: \"9c05fcab-7edd-46c8-883c-c6dd58821780\") " Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.515692 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c05fcab-7edd-46c8-883c-c6dd58821780-config\") pod \"9c05fcab-7edd-46c8-883c-c6dd58821780\" (UID: \"9c05fcab-7edd-46c8-883c-c6dd58821780\") " Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.516705 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e780ff7b-74c6-41aa-8a91-b209afe2f69c-config" (OuterVolumeSpecName: "config") pod "e780ff7b-74c6-41aa-8a91-b209afe2f69c" (UID: "e780ff7b-74c6-41aa-8a91-b209afe2f69c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.516804 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c05fcab-7edd-46c8-883c-c6dd58821780-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9c05fcab-7edd-46c8-883c-c6dd58821780" (UID: "9c05fcab-7edd-46c8-883c-c6dd58821780"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.516882 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c05fcab-7edd-46c8-883c-c6dd58821780-config" (OuterVolumeSpecName: "config") pod "9c05fcab-7edd-46c8-883c-c6dd58821780" (UID: "9c05fcab-7edd-46c8-883c-c6dd58821780"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.524138 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c05fcab-7edd-46c8-883c-c6dd58821780-kube-api-access-q9grq" (OuterVolumeSpecName: "kube-api-access-q9grq") pod "9c05fcab-7edd-46c8-883c-c6dd58821780" (UID: "9c05fcab-7edd-46c8-883c-c6dd58821780"). InnerVolumeSpecName "kube-api-access-q9grq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.527316 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e780ff7b-74c6-41aa-8a91-b209afe2f69c-kube-api-access-t2z4c" (OuterVolumeSpecName: "kube-api-access-t2z4c") pod "e780ff7b-74c6-41aa-8a91-b209afe2f69c" (UID: "e780ff7b-74c6-41aa-8a91-b209afe2f69c"). InnerVolumeSpecName "kube-api-access-t2z4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.617201 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c05fcab-7edd-46c8-883c-c6dd58821780-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.617231 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c05fcab-7edd-46c8-883c-c6dd58821780-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.617242 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2z4c\" (UniqueName: \"kubernetes.io/projected/e780ff7b-74c6-41aa-8a91-b209afe2f69c-kube-api-access-t2z4c\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.617251 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e780ff7b-74c6-41aa-8a91-b209afe2f69c-config\") on node 
\"crc\" DevicePath \"\"" Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.617260 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9grq\" (UniqueName: \"kubernetes.io/projected/9c05fcab-7edd-46c8-883c-c6dd58821780-kube-api-access-q9grq\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.649737 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 09 10:43:23 crc kubenswrapper[4740]: W1009 10:43:23.664085 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf16fec59_73b1_4b57_ab47_c1767c6c2a7d.slice/crio-21f6327839c1f5f29f73f7c9dfe16040dc1fe721e2c8b61cfe075cc0d870598e WatchSource:0}: Error finding container 21f6327839c1f5f29f73f7c9dfe16040dc1fe721e2c8b61cfe075cc0d870598e: Status 404 returned error can't find the container with id 21f6327839c1f5f29f73f7c9dfe16040dc1fe721e2c8b61cfe075cc0d870598e Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.691874 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c6rld"] Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.698447 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 10:43:23 crc kubenswrapper[4740]: W1009 10:43:23.702544 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f56ff38_de3a_4c48_8fc0_43e0eac26c55.slice/crio-d09b34fe0b95caf59b434dce7318b12a590d9ef3d613639bffdd8f741b8be57f WatchSource:0}: Error finding container d09b34fe0b95caf59b434dce7318b12a590d9ef3d613639bffdd8f741b8be57f: Status 404 returned error can't find the container with id d09b34fe0b95caf59b434dce7318b12a590d9ef3d613639bffdd8f741b8be57f Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.704376 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/openstack-cell1-galera-0"] Oct 09 10:43:23 crc kubenswrapper[4740]: W1009 10:43:23.706794 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcd9d52b_8167_47f9_8c36_b75f88119ad5.slice/crio-a17768e2a2ee0767071830488b6495d1b5e5261974ed949e3bea9eca23c5873b WatchSource:0}: Error finding container a17768e2a2ee0767071830488b6495d1b5e5261974ed949e3bea9eca23c5873b: Status 404 returned error can't find the container with id a17768e2a2ee0767071830488b6495d1b5e5261974ed949e3bea9eca23c5873b Oct 09 10:43:23 crc kubenswrapper[4740]: I1009 10:43:23.775380 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 09 10:43:23 crc kubenswrapper[4740]: W1009 10:43:23.780630 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c37175d_6801_461a_82ba_ea611afdaebf.slice/crio-bf8feb6084be661a492f8d7c6e74c524f8292829c883846a0befa577eed0eec3 WatchSource:0}: Error finding container bf8feb6084be661a492f8d7c6e74c524f8292829c883846a0befa577eed0eec3: Status 404 returned error can't find the container with id bf8feb6084be661a492f8d7c6e74c524f8292829c883846a0befa577eed0eec3 Oct 09 10:43:24 crc kubenswrapper[4740]: I1009 10:43:24.038179 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0c37175d-6801-461a-82ba-ea611afdaebf","Type":"ContainerStarted","Data":"bf8feb6084be661a492f8d7c6e74c524f8292829c883846a0befa577eed0eec3"} Oct 09 10:43:24 crc kubenswrapper[4740]: I1009 10:43:24.041439 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c6rld" event={"ID":"7f56ff38-de3a-4c48-8fc0-43e0eac26c55","Type":"ContainerStarted","Data":"d09b34fe0b95caf59b434dce7318b12a590d9ef3d613639bffdd8f741b8be57f"} Oct 09 10:43:24 crc kubenswrapper[4740]: I1009 10:43:24.043355 4740 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" event={"ID":"b7517706-a284-4265-8deb-cbb523873afd","Type":"ContainerStarted","Data":"3e973b941d6e005ee662cbd7899b3a97b969bf83edcbea3f2113adc0fef58232"} Oct 09 10:43:24 crc kubenswrapper[4740]: I1009 10:43:24.043393 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" event={"ID":"b7517706-a284-4265-8deb-cbb523873afd","Type":"ContainerStarted","Data":"8dc5a21b1e3e158b7a084265b6a4ad1e92e345031394bcd1e912aef6e93ded85"} Oct 09 10:43:24 crc kubenswrapper[4740]: I1009 10:43:24.047498 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3dce8908-af4b-4596-bed2-02788a615207","Type":"ContainerStarted","Data":"c5c86e5371413ca0b8fc8e4f7bf1402e803bbe94faa25bb783d7076a9ee9eb04"} Oct 09 10:43:24 crc kubenswrapper[4740]: I1009 10:43:24.049714 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-szmdz" event={"ID":"9c05fcab-7edd-46c8-883c-c6dd58821780","Type":"ContainerDied","Data":"6a8c87c4a02dd95f6066f18251669a92efac031bc879febeebafc25480c8d9e0"} Oct 09 10:43:24 crc kubenswrapper[4740]: I1009 10:43:24.049831 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-szmdz" Oct 09 10:43:24 crc kubenswrapper[4740]: I1009 10:43:24.058040 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"187134d2-2fe9-4beb-beff-6a48162a1933","Type":"ContainerStarted","Data":"e7cf6430c9bd0d2c75ac02f7ff81bee00e811aaba181b39cc97adae9e47a4677"} Oct 09 10:43:24 crc kubenswrapper[4740]: I1009 10:43:24.060519 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"aa98dfc6-da2e-42b0-a620-a07230e1833d","Type":"ContainerStarted","Data":"15c391a994676971ddfb4fcc296685cf22f3a13685de4b7fc64e8f38c2640173"} Oct 09 10:43:24 crc kubenswrapper[4740]: I1009 10:43:24.072299 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" event={"ID":"18483b6e-9941-4d1c-af6e-2812758e0265","Type":"ContainerStarted","Data":"8609c5b0f024f710a56eab9e6b6ac024a38facdb11427f3932e5761144cc8a7b"} Oct 09 10:43:24 crc kubenswrapper[4740]: I1009 10:43:24.073564 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"dcd9d52b-8167-47f9-8c36-b75f88119ad5","Type":"ContainerStarted","Data":"a17768e2a2ee0767071830488b6495d1b5e5261974ed949e3bea9eca23c5873b"} Oct 09 10:43:24 crc kubenswrapper[4740]: I1009 10:43:24.075214 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f16fec59-73b1-4b57-ab47-c1767c6c2a7d","Type":"ContainerStarted","Data":"21f6327839c1f5f29f73f7c9dfe16040dc1fe721e2c8b61cfe075cc0d870598e"} Oct 09 10:43:24 crc kubenswrapper[4740]: I1009 10:43:24.076447 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"97026040-37a7-4aa6-aad1-a9b204d4d329","Type":"ContainerStarted","Data":"b2afae48b8ac6f6b9c5abcb56f77dd46ba08b162589bd4f95c339c35231a27e1"} Oct 09 10:43:24 crc kubenswrapper[4740]: I1009 10:43:24.077545 4740 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-n4bxk" event={"ID":"e780ff7b-74c6-41aa-8a91-b209afe2f69c","Type":"ContainerDied","Data":"974007d6d99cc7637249dfe12dafe26fb77043c17435b3fa77594d10557f404c"} Oct 09 10:43:24 crc kubenswrapper[4740]: I1009 10:43:24.077624 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-n4bxk" Oct 09 10:43:24 crc kubenswrapper[4740]: I1009 10:43:24.208684 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-szmdz"] Oct 09 10:43:24 crc kubenswrapper[4740]: I1009 10:43:24.227305 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-szmdz"] Oct 09 10:43:24 crc kubenswrapper[4740]: I1009 10:43:24.246630 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-n4bxk"] Oct 09 10:43:24 crc kubenswrapper[4740]: I1009 10:43:24.254462 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-n4bxk"] Oct 09 10:43:24 crc kubenswrapper[4740]: I1009 10:43:24.806199 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cwdss"] Oct 09 10:43:25 crc kubenswrapper[4740]: W1009 10:43:25.057236 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a1841e0_a15d_4dca_a1a4_6b50f338ddbc.slice/crio-feda4452b3352ed0c2cf669ff324cb265fa882fe05db7d7b6f40cfa460c4ec18 WatchSource:0}: Error finding container feda4452b3352ed0c2cf669ff324cb265fa882fe05db7d7b6f40cfa460c4ec18: Status 404 returned error can't find the container with id feda4452b3352ed0c2cf669ff324cb265fa882fe05db7d7b6f40cfa460c4ec18 Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.060145 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.089742 
4740 generic.go:334] "Generic (PLEG): container finished" podID="b7517706-a284-4265-8deb-cbb523873afd" containerID="3e973b941d6e005ee662cbd7899b3a97b969bf83edcbea3f2113adc0fef58232" exitCode=0 Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.089845 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" event={"ID":"b7517706-a284-4265-8deb-cbb523873afd","Type":"ContainerDied","Data":"3e973b941d6e005ee662cbd7899b3a97b969bf83edcbea3f2113adc0fef58232"} Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.092941 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cwdss" event={"ID":"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc","Type":"ContainerStarted","Data":"feda4452b3352ed0c2cf669ff324cb265fa882fe05db7d7b6f40cfa460c4ec18"} Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.095226 4740 generic.go:334] "Generic (PLEG): container finished" podID="18483b6e-9941-4d1c-af6e-2812758e0265" containerID="eb166ac4b8be2c92d6a7502fa4c7bb6669b9624366d412ee7f6aae9c916e0d13" exitCode=0 Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.095278 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" event={"ID":"18483b6e-9941-4d1c-af6e-2812758e0265","Type":"ContainerDied","Data":"eb166ac4b8be2c92d6a7502fa4c7bb6669b9624366d412ee7f6aae9c916e0d13"} Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.583892 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.587129 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.589948 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.589964 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-kggzn" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.589980 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.590210 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.599872 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.650340 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52b13ae3-8184-4ea2-a6b5-14d739b1200e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.650380 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/52b13ae3-8184-4ea2-a6b5-14d739b1200e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.650402 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4w4p\" (UniqueName: \"kubernetes.io/projected/52b13ae3-8184-4ea2-a6b5-14d739b1200e-kube-api-access-b4w4p\") pod \"ovsdbserver-sb-0\" (UID: 
\"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.650424 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/52b13ae3-8184-4ea2-a6b5-14d739b1200e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.650454 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52b13ae3-8184-4ea2-a6b5-14d739b1200e-config\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.650728 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.650792 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b13ae3-8184-4ea2-a6b5-14d739b1200e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.650823 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/52b13ae3-8184-4ea2-a6b5-14d739b1200e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc 
kubenswrapper[4740]: I1009 10:43:25.751972 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52b13ae3-8184-4ea2-a6b5-14d739b1200e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.752027 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/52b13ae3-8184-4ea2-a6b5-14d739b1200e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.752051 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4w4p\" (UniqueName: \"kubernetes.io/projected/52b13ae3-8184-4ea2-a6b5-14d739b1200e-kube-api-access-b4w4p\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.752372 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/52b13ae3-8184-4ea2-a6b5-14d739b1200e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.752405 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52b13ae3-8184-4ea2-a6b5-14d739b1200e-config\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.752450 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/52b13ae3-8184-4ea2-a6b5-14d739b1200e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.752468 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.752484 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/52b13ae3-8184-4ea2-a6b5-14d739b1200e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.753052 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.757561 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/52b13ae3-8184-4ea2-a6b5-14d739b1200e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.759839 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52b13ae3-8184-4ea2-a6b5-14d739b1200e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 
10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.760661 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52b13ae3-8184-4ea2-a6b5-14d739b1200e-config\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.762233 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/52b13ae3-8184-4ea2-a6b5-14d739b1200e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.763962 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/52b13ae3-8184-4ea2-a6b5-14d739b1200e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.764444 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b13ae3-8184-4ea2-a6b5-14d739b1200e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.765022 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c05fcab-7edd-46c8-883c-c6dd58821780" path="/var/lib/kubelet/pods/9c05fcab-7edd-46c8-883c-c6dd58821780/volumes" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.765392 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e780ff7b-74c6-41aa-8a91-b209afe2f69c" path="/var/lib/kubelet/pods/e780ff7b-74c6-41aa-8a91-b209afe2f69c/volumes" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 
10:43:25.769842 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4w4p\" (UniqueName: \"kubernetes.io/projected/52b13ae3-8184-4ea2-a6b5-14d739b1200e-kube-api-access-b4w4p\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.789386 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"52b13ae3-8184-4ea2-a6b5-14d739b1200e\") " pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:25 crc kubenswrapper[4740]: I1009 10:43:25.911614 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.364785 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-pn5fj"] Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.367938 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-pn5fj" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.370655 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.376286 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-pn5fj"] Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.466829 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2vpp\" (UniqueName: \"kubernetes.io/projected/94fc74ef-6b90-4c9a-9da5-d7eb116a7806-kube-api-access-j2vpp\") pod \"ovn-controller-metrics-pn5fj\" (UID: \"94fc74ef-6b90-4c9a-9da5-d7eb116a7806\") " pod="openstack/ovn-controller-metrics-pn5fj" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.466957 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/94fc74ef-6b90-4c9a-9da5-d7eb116a7806-ovn-rundir\") pod \"ovn-controller-metrics-pn5fj\" (UID: \"94fc74ef-6b90-4c9a-9da5-d7eb116a7806\") " pod="openstack/ovn-controller-metrics-pn5fj" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.466990 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/94fc74ef-6b90-4c9a-9da5-d7eb116a7806-ovs-rundir\") pod \"ovn-controller-metrics-pn5fj\" (UID: \"94fc74ef-6b90-4c9a-9da5-d7eb116a7806\") " pod="openstack/ovn-controller-metrics-pn5fj" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.467128 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/94fc74ef-6b90-4c9a-9da5-d7eb116a7806-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-pn5fj\" (UID: 
\"94fc74ef-6b90-4c9a-9da5-d7eb116a7806\") " pod="openstack/ovn-controller-metrics-pn5fj" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.467302 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94fc74ef-6b90-4c9a-9da5-d7eb116a7806-config\") pod \"ovn-controller-metrics-pn5fj\" (UID: \"94fc74ef-6b90-4c9a-9da5-d7eb116a7806\") " pod="openstack/ovn-controller-metrics-pn5fj" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.467508 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94fc74ef-6b90-4c9a-9da5-d7eb116a7806-combined-ca-bundle\") pod \"ovn-controller-metrics-pn5fj\" (UID: \"94fc74ef-6b90-4c9a-9da5-d7eb116a7806\") " pod="openstack/ovn-controller-metrics-pn5fj" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.518404 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tv5zx"] Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.542562 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-hvkb9"] Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.543965 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.547623 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-hvkb9"] Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.548190 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.568372 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bef63c9c-1759-461f-be02-bc8ee3a1f548-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-hvkb9\" (UID: \"bef63c9c-1759-461f-be02-bc8ee3a1f548\") " pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.568416 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82s5n\" (UniqueName: \"kubernetes.io/projected/bef63c9c-1759-461f-be02-bc8ee3a1f548-kube-api-access-82s5n\") pod \"dnsmasq-dns-7fd796d7df-hvkb9\" (UID: \"bef63c9c-1759-461f-be02-bc8ee3a1f548\") " pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.568455 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94fc74ef-6b90-4c9a-9da5-d7eb116a7806-combined-ca-bundle\") pod \"ovn-controller-metrics-pn5fj\" (UID: \"94fc74ef-6b90-4c9a-9da5-d7eb116a7806\") " pod="openstack/ovn-controller-metrics-pn5fj" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.568507 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bef63c9c-1759-461f-be02-bc8ee3a1f548-config\") pod \"dnsmasq-dns-7fd796d7df-hvkb9\" (UID: \"bef63c9c-1759-461f-be02-bc8ee3a1f548\") " pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" 
Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.568524 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2vpp\" (UniqueName: \"kubernetes.io/projected/94fc74ef-6b90-4c9a-9da5-d7eb116a7806-kube-api-access-j2vpp\") pod \"ovn-controller-metrics-pn5fj\" (UID: \"94fc74ef-6b90-4c9a-9da5-d7eb116a7806\") " pod="openstack/ovn-controller-metrics-pn5fj" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.568546 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/94fc74ef-6b90-4c9a-9da5-d7eb116a7806-ovn-rundir\") pod \"ovn-controller-metrics-pn5fj\" (UID: \"94fc74ef-6b90-4c9a-9da5-d7eb116a7806\") " pod="openstack/ovn-controller-metrics-pn5fj" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.568563 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/94fc74ef-6b90-4c9a-9da5-d7eb116a7806-ovs-rundir\") pod \"ovn-controller-metrics-pn5fj\" (UID: \"94fc74ef-6b90-4c9a-9da5-d7eb116a7806\") " pod="openstack/ovn-controller-metrics-pn5fj" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.568579 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bef63c9c-1759-461f-be02-bc8ee3a1f548-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-hvkb9\" (UID: \"bef63c9c-1759-461f-be02-bc8ee3a1f548\") " pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.568613 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/94fc74ef-6b90-4c9a-9da5-d7eb116a7806-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-pn5fj\" (UID: \"94fc74ef-6b90-4c9a-9da5-d7eb116a7806\") " pod="openstack/ovn-controller-metrics-pn5fj" Oct 09 10:43:26 crc 
kubenswrapper[4740]: I1009 10:43:26.568636 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94fc74ef-6b90-4c9a-9da5-d7eb116a7806-config\") pod \"ovn-controller-metrics-pn5fj\" (UID: \"94fc74ef-6b90-4c9a-9da5-d7eb116a7806\") " pod="openstack/ovn-controller-metrics-pn5fj" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.569205 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94fc74ef-6b90-4c9a-9da5-d7eb116a7806-config\") pod \"ovn-controller-metrics-pn5fj\" (UID: \"94fc74ef-6b90-4c9a-9da5-d7eb116a7806\") " pod="openstack/ovn-controller-metrics-pn5fj" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.570161 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/94fc74ef-6b90-4c9a-9da5-d7eb116a7806-ovn-rundir\") pod \"ovn-controller-metrics-pn5fj\" (UID: \"94fc74ef-6b90-4c9a-9da5-d7eb116a7806\") " pod="openstack/ovn-controller-metrics-pn5fj" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.570508 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/94fc74ef-6b90-4c9a-9da5-d7eb116a7806-ovs-rundir\") pod \"ovn-controller-metrics-pn5fj\" (UID: \"94fc74ef-6b90-4c9a-9da5-d7eb116a7806\") " pod="openstack/ovn-controller-metrics-pn5fj" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.577850 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94fc74ef-6b90-4c9a-9da5-d7eb116a7806-combined-ca-bundle\") pod \"ovn-controller-metrics-pn5fj\" (UID: \"94fc74ef-6b90-4c9a-9da5-d7eb116a7806\") " pod="openstack/ovn-controller-metrics-pn5fj" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.587901 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-j2vpp\" (UniqueName: \"kubernetes.io/projected/94fc74ef-6b90-4c9a-9da5-d7eb116a7806-kube-api-access-j2vpp\") pod \"ovn-controller-metrics-pn5fj\" (UID: \"94fc74ef-6b90-4c9a-9da5-d7eb116a7806\") " pod="openstack/ovn-controller-metrics-pn5fj" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.591184 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/94fc74ef-6b90-4c9a-9da5-d7eb116a7806-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-pn5fj\" (UID: \"94fc74ef-6b90-4c9a-9da5-d7eb116a7806\") " pod="openstack/ovn-controller-metrics-pn5fj" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.670556 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bef63c9c-1759-461f-be02-bc8ee3a1f548-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-hvkb9\" (UID: \"bef63c9c-1759-461f-be02-bc8ee3a1f548\") " pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.670622 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82s5n\" (UniqueName: \"kubernetes.io/projected/bef63c9c-1759-461f-be02-bc8ee3a1f548-kube-api-access-82s5n\") pod \"dnsmasq-dns-7fd796d7df-hvkb9\" (UID: \"bef63c9c-1759-461f-be02-bc8ee3a1f548\") " pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.670726 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bef63c9c-1759-461f-be02-bc8ee3a1f548-config\") pod \"dnsmasq-dns-7fd796d7df-hvkb9\" (UID: \"bef63c9c-1759-461f-be02-bc8ee3a1f548\") " pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.670786 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/bef63c9c-1759-461f-be02-bc8ee3a1f548-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-hvkb9\" (UID: \"bef63c9c-1759-461f-be02-bc8ee3a1f548\") " pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.671640 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bef63c9c-1759-461f-be02-bc8ee3a1f548-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-hvkb9\" (UID: \"bef63c9c-1759-461f-be02-bc8ee3a1f548\") " pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.671640 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bef63c9c-1759-461f-be02-bc8ee3a1f548-config\") pod \"dnsmasq-dns-7fd796d7df-hvkb9\" (UID: \"bef63c9c-1759-461f-be02-bc8ee3a1f548\") " pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.671654 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bef63c9c-1759-461f-be02-bc8ee3a1f548-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-hvkb9\" (UID: \"bef63c9c-1759-461f-be02-bc8ee3a1f548\") " pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.688162 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82s5n\" (UniqueName: \"kubernetes.io/projected/bef63c9c-1759-461f-be02-bc8ee3a1f548-kube-api-access-82s5n\") pod \"dnsmasq-dns-7fd796d7df-hvkb9\" (UID: \"bef63c9c-1759-461f-be02-bc8ee3a1f548\") " pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.732265 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-pn5fj" Oct 09 10:43:26 crc kubenswrapper[4740]: I1009 10:43:26.867692 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" Oct 09 10:43:31 crc kubenswrapper[4740]: I1009 10:43:31.406008 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-hvkb9"] Oct 09 10:43:31 crc kubenswrapper[4740]: I1009 10:43:31.461726 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-pn5fj"] Oct 09 10:43:31 crc kubenswrapper[4740]: I1009 10:43:31.537454 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 09 10:43:31 crc kubenswrapper[4740]: W1009 10:43:31.563799 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbef63c9c_1759_461f_be02_bc8ee3a1f548.slice/crio-4c8370f86cdf0dc242888b09c8c74396977b4a078c70950fb2070c4f39645efb WatchSource:0}: Error finding container 4c8370f86cdf0dc242888b09c8c74396977b4a078c70950fb2070c4f39645efb: Status 404 returned error can't find the container with id 4c8370f86cdf0dc242888b09c8c74396977b4a078c70950fb2070c4f39645efb Oct 09 10:43:31 crc kubenswrapper[4740]: W1009 10:43:31.567040 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52b13ae3_8184_4ea2_a6b5_14d739b1200e.slice/crio-bb01ccdd86ccad45236877a66ee6e920edd9b23b34bd385ff3d91a0e6600bd96 WatchSource:0}: Error finding container bb01ccdd86ccad45236877a66ee6e920edd9b23b34bd385ff3d91a0e6600bd96: Status 404 returned error can't find the container with id bb01ccdd86ccad45236877a66ee6e920edd9b23b34bd385ff3d91a0e6600bd96 Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.171450 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"f16fec59-73b1-4b57-ab47-c1767c6c2a7d","Type":"ContainerStarted","Data":"542397f295e857ec4af7d7047418c813d5b056ed40c1266ffbd9556cea63552f"} Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.171946 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.173331 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" event={"ID":"b7517706-a284-4265-8deb-cbb523873afd","Type":"ContainerStarted","Data":"d6f0e2c93038e9b8d3e8d993ec14de8a11fddea6ba1a6126574266599a55d88d"} Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.173432 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" podUID="b7517706-a284-4265-8deb-cbb523873afd" containerName="dnsmasq-dns" containerID="cri-o://d6f0e2c93038e9b8d3e8d993ec14de8a11fddea6ba1a6126574266599a55d88d" gracePeriod=10 Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.173614 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.177571 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0c37175d-6801-461a-82ba-ea611afdaebf","Type":"ContainerStarted","Data":"d35439baedb9d758fe7083bf761a258d002f5ddbcc25b423496e876033aa8a61"} Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.179313 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3dce8908-af4b-4596-bed2-02788a615207","Type":"ContainerStarted","Data":"d4c09728094543afaa0094dcfddaa8ba320183ec3eb87ec08bd154cae2750dbb"} Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.197923 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=9.830578155 
podStartE2EDuration="17.197892095s" podCreationTimestamp="2025-10-09 10:43:15 +0000 UTC" firstStartedPulling="2025-10-09 10:43:23.665996295 +0000 UTC m=+942.628196666" lastFinishedPulling="2025-10-09 10:43:31.033310185 +0000 UTC m=+949.995510606" observedRunningTime="2025-10-09 10:43:32.197140404 +0000 UTC m=+951.159340795" watchObservedRunningTime="2025-10-09 10:43:32.197892095 +0000 UTC m=+951.160092476" Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.200656 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"97026040-37a7-4aa6-aad1-a9b204d4d329","Type":"ContainerStarted","Data":"97072505b68eafbf40eeaa31da55cadfb06b4e193e60ca5d17cde2ac2256a303"} Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.200737 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.203087 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-pn5fj" event={"ID":"94fc74ef-6b90-4c9a-9da5-d7eb116a7806","Type":"ContainerStarted","Data":"1a7710dcad13128fa98a53ea343cf644231dd113a26e0de80d2a0a462fa456e6"} Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.205055 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"52b13ae3-8184-4ea2-a6b5-14d739b1200e","Type":"ContainerStarted","Data":"bb01ccdd86ccad45236877a66ee6e920edd9b23b34bd385ff3d91a0e6600bd96"} Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.207111 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"dcd9d52b-8167-47f9-8c36-b75f88119ad5","Type":"ContainerStarted","Data":"00dc5edfd3790d8925efa638e214f829805318da3da0f60d90d59331066e937f"} Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.210845 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c6rld" 
event={"ID":"7f56ff38-de3a-4c48-8fc0-43e0eac26c55","Type":"ContainerStarted","Data":"d40f889f169fa07df746c70fe3f28d432d1579f1b94ddbdc5be2a58924bc97e0"} Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.211152 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-c6rld" Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.224305 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cwdss" event={"ID":"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc","Type":"ContainerStarted","Data":"174466ad1c20b4991d6c0e5e9c44a630e8cfff581a77112adfad5d6d1a3873f9"} Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.226599 4740 generic.go:334] "Generic (PLEG): container finished" podID="bef63c9c-1759-461f-be02-bc8ee3a1f548" containerID="6ca0a4ed9e69d6c5430a27d821e7f1c817d13fc853e72ea977b11a83f2818363" exitCode=0 Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.226651 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" event={"ID":"bef63c9c-1759-461f-be02-bc8ee3a1f548","Type":"ContainerDied","Data":"6ca0a4ed9e69d6c5430a27d821e7f1c817d13fc853e72ea977b11a83f2818363"} Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.226673 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" event={"ID":"bef63c9c-1759-461f-be02-bc8ee3a1f548","Type":"ContainerStarted","Data":"4c8370f86cdf0dc242888b09c8c74396977b4a078c70950fb2070c4f39645efb"} Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.237948 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" event={"ID":"18483b6e-9941-4d1c-af6e-2812758e0265","Type":"ContainerStarted","Data":"f9eb71e9fec1337e96de8f69121a038a4155d477d6413e3172a8f15cab535500"} Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.238224 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.241603 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" podStartSLOduration=21.805610156 podStartE2EDuration="22.241583996s" podCreationTimestamp="2025-10-09 10:43:10 +0000 UTC" firstStartedPulling="2025-10-09 10:43:23.423916807 +0000 UTC m=+942.386117188" lastFinishedPulling="2025-10-09 10:43:23.859890657 +0000 UTC m=+942.822091028" observedRunningTime="2025-10-09 10:43:32.237651398 +0000 UTC m=+951.199851789" watchObservedRunningTime="2025-10-09 10:43:32.241583996 +0000 UTC m=+951.203784377" Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.297121 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=7.336581137 podStartE2EDuration="15.297099363s" podCreationTimestamp="2025-10-09 10:43:17 +0000 UTC" firstStartedPulling="2025-10-09 10:43:23.708629807 +0000 UTC m=+942.670830188" lastFinishedPulling="2025-10-09 10:43:31.669148033 +0000 UTC m=+950.631348414" observedRunningTime="2025-10-09 10:43:32.295817918 +0000 UTC m=+951.258018319" watchObservedRunningTime="2025-10-09 10:43:32.297099363 +0000 UTC m=+951.259299744" Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.338704 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-c6rld" podStartSLOduration=5.01805211 podStartE2EDuration="12.338683857s" podCreationTimestamp="2025-10-09 10:43:20 +0000 UTC" firstStartedPulling="2025-10-09 10:43:23.710722075 +0000 UTC m=+942.672922466" lastFinishedPulling="2025-10-09 10:43:31.031353792 +0000 UTC m=+949.993554213" observedRunningTime="2025-10-09 10:43:32.333573606 +0000 UTC m=+951.295773987" watchObservedRunningTime="2025-10-09 10:43:32.338683857 +0000 UTC m=+951.300884238" Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.353927 4740 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" podStartSLOduration=20.855447986 podStartE2EDuration="21.353905506s" podCreationTimestamp="2025-10-09 10:43:11 +0000 UTC" firstStartedPulling="2025-10-09 10:43:23.420484492 +0000 UTC m=+942.382684873" lastFinishedPulling="2025-10-09 10:43:23.918942012 +0000 UTC m=+942.881142393" observedRunningTime="2025-10-09 10:43:32.352346143 +0000 UTC m=+951.314546524" watchObservedRunningTime="2025-10-09 10:43:32.353905506 +0000 UTC m=+951.316105887" Oct 09 10:43:32 crc kubenswrapper[4740]: I1009 10:43:32.902553 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.073649 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7517706-a284-4265-8deb-cbb523873afd-config\") pod \"b7517706-a284-4265-8deb-cbb523873afd\" (UID: \"b7517706-a284-4265-8deb-cbb523873afd\") " Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.074081 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7517706-a284-4265-8deb-cbb523873afd-dns-svc\") pod \"b7517706-a284-4265-8deb-cbb523873afd\" (UID: \"b7517706-a284-4265-8deb-cbb523873afd\") " Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.074117 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htnlc\" (UniqueName: \"kubernetes.io/projected/b7517706-a284-4265-8deb-cbb523873afd-kube-api-access-htnlc\") pod \"b7517706-a284-4265-8deb-cbb523873afd\" (UID: \"b7517706-a284-4265-8deb-cbb523873afd\") " Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.085085 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b7517706-a284-4265-8deb-cbb523873afd-kube-api-access-htnlc" (OuterVolumeSpecName: "kube-api-access-htnlc") pod "b7517706-a284-4265-8deb-cbb523873afd" (UID: "b7517706-a284-4265-8deb-cbb523873afd"). InnerVolumeSpecName "kube-api-access-htnlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.124355 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7517706-a284-4265-8deb-cbb523873afd-config" (OuterVolumeSpecName: "config") pod "b7517706-a284-4265-8deb-cbb523873afd" (UID: "b7517706-a284-4265-8deb-cbb523873afd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.138903 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7517706-a284-4265-8deb-cbb523873afd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b7517706-a284-4265-8deb-cbb523873afd" (UID: "b7517706-a284-4265-8deb-cbb523873afd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.175566 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7517706-a284-4265-8deb-cbb523873afd-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.175598 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htnlc\" (UniqueName: \"kubernetes.io/projected/b7517706-a284-4265-8deb-cbb523873afd-kube-api-access-htnlc\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.175611 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7517706-a284-4265-8deb-cbb523873afd-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.271202 4740 generic.go:334] "Generic (PLEG): container finished" podID="5a1841e0-a15d-4dca-a1a4-6b50f338ddbc" containerID="174466ad1c20b4991d6c0e5e9c44a630e8cfff581a77112adfad5d6d1a3873f9" exitCode=0 Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.271320 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cwdss" event={"ID":"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc","Type":"ContainerDied","Data":"174466ad1c20b4991d6c0e5e9c44a630e8cfff581a77112adfad5d6d1a3873f9"} Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.273820 4740 generic.go:334] "Generic (PLEG): container finished" podID="b7517706-a284-4265-8deb-cbb523873afd" containerID="d6f0e2c93038e9b8d3e8d993ec14de8a11fddea6ba1a6126574266599a55d88d" exitCode=0 Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.273919 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.274021 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" event={"ID":"b7517706-a284-4265-8deb-cbb523873afd","Type":"ContainerDied","Data":"d6f0e2c93038e9b8d3e8d993ec14de8a11fddea6ba1a6126574266599a55d88d"} Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.274093 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-tv5zx" event={"ID":"b7517706-a284-4265-8deb-cbb523873afd","Type":"ContainerDied","Data":"8dc5a21b1e3e158b7a084265b6a4ad1e92e345031394bcd1e912aef6e93ded85"} Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.274114 4740 scope.go:117] "RemoveContainer" containerID="d6f0e2c93038e9b8d3e8d993ec14de8a11fddea6ba1a6126574266599a55d88d" Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.276738 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" event={"ID":"bef63c9c-1759-461f-be02-bc8ee3a1f548","Type":"ContainerStarted","Data":"abc5a19c3dcf61db9f3102ed6073c528d502e922cd722c627c43c308a37007f5"} Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.277445 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.279159 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"187134d2-2fe9-4beb-beff-6a48162a1933","Type":"ContainerStarted","Data":"bcd7f5081393f9b0fd83b07b79c9fd1569cb832e594a750001a732dca196c1c0"} Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.281514 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"aa98dfc6-da2e-42b0-a620-a07230e1833d","Type":"ContainerStarted","Data":"cdd4feba6cd032d418bc8180dd1a1569db9bc194b9a8d185360898a2b39c3a5c"} Oct 09 10:43:33 crc 
kubenswrapper[4740]: I1009 10:43:33.329554 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tv5zx"] Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.334802 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tv5zx"] Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.345465 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" podStartSLOduration=7.345449945 podStartE2EDuration="7.345449945s" podCreationTimestamp="2025-10-09 10:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:43:33.34164601 +0000 UTC m=+952.303846401" watchObservedRunningTime="2025-10-09 10:43:33.345449945 +0000 UTC m=+952.307650316" Oct 09 10:43:33 crc kubenswrapper[4740]: I1009 10:43:33.766668 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7517706-a284-4265-8deb-cbb523873afd" path="/var/lib/kubelet/pods/b7517706-a284-4265-8deb-cbb523873afd/volumes" Oct 09 10:43:35 crc kubenswrapper[4740]: I1009 10:43:35.408377 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 10:43:35 crc kubenswrapper[4740]: I1009 10:43:35.408844 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 10:43:35 crc kubenswrapper[4740]: I1009 10:43:35.408909 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 10:43:35 crc kubenswrapper[4740]: I1009 10:43:35.410349 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"db6bdc02b2d1bf480bf563dc4b4a9b65b436c587d39e3c847d517ccd6a5d7f1c"} pod="openshift-machine-config-operator/machine-config-daemon-kdjch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 10:43:35 crc kubenswrapper[4740]: I1009 10:43:35.410446 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" containerID="cri-o://db6bdc02b2d1bf480bf563dc4b4a9b65b436c587d39e3c847d517ccd6a5d7f1c" gracePeriod=600 Oct 09 10:43:36 crc kubenswrapper[4740]: I1009 10:43:36.410020 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" Oct 09 10:43:38 crc kubenswrapper[4740]: I1009 10:43:38.128375 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 09 10:43:40 crc kubenswrapper[4740]: I1009 10:43:40.158132 4740 scope.go:117] "RemoveContainer" containerID="3e973b941d6e005ee662cbd7899b3a97b969bf83edcbea3f2113adc0fef58232" Oct 09 10:43:40 crc kubenswrapper[4740]: I1009 10:43:40.341210 4740 generic.go:334] "Generic (PLEG): container finished" podID="dcd9d52b-8167-47f9-8c36-b75f88119ad5" containerID="00dc5edfd3790d8925efa638e214f829805318da3da0f60d90d59331066e937f" exitCode=0 Oct 09 10:43:40 crc kubenswrapper[4740]: I1009 10:43:40.341275 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"dcd9d52b-8167-47f9-8c36-b75f88119ad5","Type":"ContainerDied","Data":"00dc5edfd3790d8925efa638e214f829805318da3da0f60d90d59331066e937f"} Oct 09 10:43:40 crc kubenswrapper[4740]: I1009 10:43:40.345647 4740 generic.go:334] "Generic (PLEG): container finished" podID="3dce8908-af4b-4596-bed2-02788a615207" containerID="d4c09728094543afaa0094dcfddaa8ba320183ec3eb87ec08bd154cae2750dbb" exitCode=0 Oct 09 10:43:40 crc kubenswrapper[4740]: I1009 10:43:40.345748 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3dce8908-af4b-4596-bed2-02788a615207","Type":"ContainerDied","Data":"d4c09728094543afaa0094dcfddaa8ba320183ec3eb87ec08bd154cae2750dbb"} Oct 09 10:43:40 crc kubenswrapper[4740]: I1009 10:43:40.348567 4740 generic.go:334] "Generic (PLEG): container finished" podID="223b849a-db98-4f56-a649-9e144189950a" containerID="db6bdc02b2d1bf480bf563dc4b4a9b65b436c587d39e3c847d517ccd6a5d7f1c" exitCode=0 Oct 09 10:43:40 crc kubenswrapper[4740]: I1009 10:43:40.348612 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerDied","Data":"db6bdc02b2d1bf480bf563dc4b4a9b65b436c587d39e3c847d517ccd6a5d7f1c"} Oct 09 10:43:40 crc kubenswrapper[4740]: I1009 10:43:40.473912 4740 scope.go:117] "RemoveContainer" containerID="d6f0e2c93038e9b8d3e8d993ec14de8a11fddea6ba1a6126574266599a55d88d" Oct 09 10:43:40 crc kubenswrapper[4740]: E1009 10:43:40.476063 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6f0e2c93038e9b8d3e8d993ec14de8a11fddea6ba1a6126574266599a55d88d\": container with ID starting with d6f0e2c93038e9b8d3e8d993ec14de8a11fddea6ba1a6126574266599a55d88d not found: ID does not exist" containerID="d6f0e2c93038e9b8d3e8d993ec14de8a11fddea6ba1a6126574266599a55d88d" Oct 09 10:43:40 crc kubenswrapper[4740]: I1009 
10:43:40.476109 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6f0e2c93038e9b8d3e8d993ec14de8a11fddea6ba1a6126574266599a55d88d"} err="failed to get container status \"d6f0e2c93038e9b8d3e8d993ec14de8a11fddea6ba1a6126574266599a55d88d\": rpc error: code = NotFound desc = could not find container \"d6f0e2c93038e9b8d3e8d993ec14de8a11fddea6ba1a6126574266599a55d88d\": container with ID starting with d6f0e2c93038e9b8d3e8d993ec14de8a11fddea6ba1a6126574266599a55d88d not found: ID does not exist" Oct 09 10:43:40 crc kubenswrapper[4740]: I1009 10:43:40.476139 4740 scope.go:117] "RemoveContainer" containerID="3e973b941d6e005ee662cbd7899b3a97b969bf83edcbea3f2113adc0fef58232" Oct 09 10:43:40 crc kubenswrapper[4740]: E1009 10:43:40.476419 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e973b941d6e005ee662cbd7899b3a97b969bf83edcbea3f2113adc0fef58232\": container with ID starting with 3e973b941d6e005ee662cbd7899b3a97b969bf83edcbea3f2113adc0fef58232 not found: ID does not exist" containerID="3e973b941d6e005ee662cbd7899b3a97b969bf83edcbea3f2113adc0fef58232" Oct 09 10:43:40 crc kubenswrapper[4740]: I1009 10:43:40.476443 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e973b941d6e005ee662cbd7899b3a97b969bf83edcbea3f2113adc0fef58232"} err="failed to get container status \"3e973b941d6e005ee662cbd7899b3a97b969bf83edcbea3f2113adc0fef58232\": rpc error: code = NotFound desc = could not find container \"3e973b941d6e005ee662cbd7899b3a97b969bf83edcbea3f2113adc0fef58232\": container with ID starting with 3e973b941d6e005ee662cbd7899b3a97b969bf83edcbea3f2113adc0fef58232 not found: ID does not exist" Oct 09 10:43:40 crc kubenswrapper[4740]: I1009 10:43:40.476458 4740 scope.go:117] "RemoveContainer" containerID="afadc9267ef0dcffe417993e78f8ce5f9baf0ee72c33f5f9de1c87bbb7818e64" Oct 09 10:43:40 crc 
kubenswrapper[4740]: I1009 10:43:40.934961 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.357656 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-pn5fj" event={"ID":"94fc74ef-6b90-4c9a-9da5-d7eb116a7806","Type":"ContainerStarted","Data":"6bf1f39fe476247f3c7251354a9740a8677ac04551dc87675845d45fbf7e1b2d"} Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.360678 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerStarted","Data":"fbbd1d786738a0dbe0197a069ad3e53334cad14f3901ee957620b2bd7f765083"} Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.362945 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"52b13ae3-8184-4ea2-a6b5-14d739b1200e","Type":"ContainerStarted","Data":"79ab5ac48b8322a2311a0be3234be6d817edd54f48d279dd4e1202ec6a9a67b9"} Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.362983 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"52b13ae3-8184-4ea2-a6b5-14d739b1200e","Type":"ContainerStarted","Data":"1a2421bb814e29db7a00df8dc11b9b0acb2798e27146eb51bd4375ae110531dc"} Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.365405 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cwdss" event={"ID":"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc","Type":"ContainerStarted","Data":"f41d29a587cdc63c0260cf58c1859fd077530561cd2d77964f360e25c73b1c79"} Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.365460 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cwdss" 
event={"ID":"5a1841e0-a15d-4dca-a1a4-6b50f338ddbc","Type":"ContainerStarted","Data":"dd41b4f504492bd824ad6d2ffdfca9bae348a3643597836fd981d651486b7e70"} Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.365605 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.367943 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"dcd9d52b-8167-47f9-8c36-b75f88119ad5","Type":"ContainerStarted","Data":"ce8e8d9032efb32bd9befdd235d8d41c6e022aae25798602bbcdf920ffe00208"} Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.374268 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0c37175d-6801-461a-82ba-ea611afdaebf","Type":"ContainerStarted","Data":"b05ed4f0790aa9b62d268d3e25fa7337d211890d18f950e646a5190a823e4a1f"} Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.380071 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3dce8908-af4b-4596-bed2-02788a615207","Type":"ContainerStarted","Data":"a55dd0c4a2b6d356519364857f3be29b9b6a52c33ee62966acb06ea986674866"} Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.381314 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-pn5fj" podStartSLOduration=6.367868689 podStartE2EDuration="15.381299942s" podCreationTimestamp="2025-10-09 10:43:26 +0000 UTC" firstStartedPulling="2025-10-09 10:43:31.536804053 +0000 UTC m=+950.499004434" lastFinishedPulling="2025-10-09 10:43:40.550235306 +0000 UTC m=+959.512435687" observedRunningTime="2025-10-09 10:43:41.374779293 +0000 UTC m=+960.336979664" watchObservedRunningTime="2025-10-09 10:43:41.381299942 +0000 UTC m=+960.343500323" Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.401899 4740 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/ovn-controller-ovs-cwdss" podStartSLOduration=15.54797747 podStartE2EDuration="21.401877258s" podCreationTimestamp="2025-10-09 10:43:20 +0000 UTC" firstStartedPulling="2025-10-09 10:43:25.059946512 +0000 UTC m=+944.022146883" lastFinishedPulling="2025-10-09 10:43:30.91384628 +0000 UTC m=+949.876046671" observedRunningTime="2025-10-09 10:43:41.396998354 +0000 UTC m=+960.359198775" watchObservedRunningTime="2025-10-09 10:43:41.401877258 +0000 UTC m=+960.364077639" Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.433406 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=20.041014495 podStartE2EDuration="27.433367544s" podCreationTimestamp="2025-10-09 10:43:14 +0000 UTC" firstStartedPulling="2025-10-09 10:43:23.709149112 +0000 UTC m=+942.671349513" lastFinishedPulling="2025-10-09 10:43:31.101502171 +0000 UTC m=+950.063702562" observedRunningTime="2025-10-09 10:43:41.428394828 +0000 UTC m=+960.390595209" watchObservedRunningTime="2025-10-09 10:43:41.433367544 +0000 UTC m=+960.395567925" Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.473379 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=15.864079635 podStartE2EDuration="17.473359014s" podCreationTimestamp="2025-10-09 10:43:24 +0000 UTC" firstStartedPulling="2025-10-09 10:43:31.592496535 +0000 UTC m=+950.554721766" lastFinishedPulling="2025-10-09 10:43:33.201800764 +0000 UTC m=+952.164001145" observedRunningTime="2025-10-09 10:43:41.470575748 +0000 UTC m=+960.432776179" watchObservedRunningTime="2025-10-09 10:43:41.473359014 +0000 UTC m=+960.435559405" Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.495033 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.709671409 podStartE2EDuration="21.49501021s" podCreationTimestamp="2025-10-09 
10:43:20 +0000 UTC" firstStartedPulling="2025-10-09 10:43:23.782503569 +0000 UTC m=+942.744703950" lastFinishedPulling="2025-10-09 10:43:40.56784235 +0000 UTC m=+959.530042751" observedRunningTime="2025-10-09 10:43:41.492023258 +0000 UTC m=+960.454223649" watchObservedRunningTime="2025-10-09 10:43:41.49501021 +0000 UTC m=+960.457210611" Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.869957 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.880019 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=20.458954599 podStartE2EDuration="27.880004968s" podCreationTimestamp="2025-10-09 10:43:14 +0000 UTC" firstStartedPulling="2025-10-09 10:43:23.425320675 +0000 UTC m=+942.387521056" lastFinishedPulling="2025-10-09 10:43:30.846371024 +0000 UTC m=+949.808571425" observedRunningTime="2025-10-09 10:43:41.522169367 +0000 UTC m=+960.484369748" watchObservedRunningTime="2025-10-09 10:43:41.880004968 +0000 UTC m=+960.842205349" Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.884176 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-hvkb9"] Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.890474 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.918500 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8zqdw"] Oct 09 10:43:41 crc kubenswrapper[4740]: E1009 10:43:41.918983 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7517706-a284-4265-8deb-cbb523873afd" containerName="init" Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.918999 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7517706-a284-4265-8deb-cbb523873afd" containerName="init" 
Oct 09 10:43:41 crc kubenswrapper[4740]: E1009 10:43:41.919019 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7517706-a284-4265-8deb-cbb523873afd" containerName="dnsmasq-dns" Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.919027 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7517706-a284-4265-8deb-cbb523873afd" containerName="dnsmasq-dns" Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.919175 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7517706-a284-4265-8deb-cbb523873afd" containerName="dnsmasq-dns" Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.919889 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.926686 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.932844 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-8zqdw\" (UID: \"6fa0d50b-d592-4264-aedc-a407f94125be\") " pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.932900 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-config\") pod \"dnsmasq-dns-86db49b7ff-8zqdw\" (UID: \"6fa0d50b-d592-4264-aedc-a407f94125be\") " pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.932917 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-8zqdw\" (UID: \"6fa0d50b-d592-4264-aedc-a407f94125be\") " pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.933029 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gmsl\" (UniqueName: \"kubernetes.io/projected/6fa0d50b-d592-4264-aedc-a407f94125be-kube-api-access-8gmsl\") pod \"dnsmasq-dns-86db49b7ff-8zqdw\" (UID: \"6fa0d50b-d592-4264-aedc-a407f94125be\") " pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.933052 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-8zqdw\" (UID: \"6fa0d50b-d592-4264-aedc-a407f94125be\") " pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" Oct 09 10:43:41 crc kubenswrapper[4740]: I1009 10:43:41.933115 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8zqdw"] Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.034740 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gmsl\" (UniqueName: \"kubernetes.io/projected/6fa0d50b-d592-4264-aedc-a407f94125be-kube-api-access-8gmsl\") pod \"dnsmasq-dns-86db49b7ff-8zqdw\" (UID: \"6fa0d50b-d592-4264-aedc-a407f94125be\") " pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.034830 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-8zqdw\" (UID: \"6fa0d50b-d592-4264-aedc-a407f94125be\") " pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" Oct 09 10:43:42 crc 
kubenswrapper[4740]: I1009 10:43:42.034870 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-8zqdw\" (UID: \"6fa0d50b-d592-4264-aedc-a407f94125be\") " pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.034917 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-config\") pod \"dnsmasq-dns-86db49b7ff-8zqdw\" (UID: \"6fa0d50b-d592-4264-aedc-a407f94125be\") " pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.034939 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-8zqdw\" (UID: \"6fa0d50b-d592-4264-aedc-a407f94125be\") " pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.035877 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-8zqdw\" (UID: \"6fa0d50b-d592-4264-aedc-a407f94125be\") " pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.036708 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-8zqdw\" (UID: \"6fa0d50b-d592-4264-aedc-a407f94125be\") " pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.037002 4740 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-config\") pod \"dnsmasq-dns-86db49b7ff-8zqdw\" (UID: \"6fa0d50b-d592-4264-aedc-a407f94125be\") " pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.037146 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-8zqdw\" (UID: \"6fa0d50b-d592-4264-aedc-a407f94125be\") " pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.055702 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gmsl\" (UniqueName: \"kubernetes.io/projected/6fa0d50b-d592-4264-aedc-a407f94125be-kube-api-access-8gmsl\") pod \"dnsmasq-dns-86db49b7ff-8zqdw\" (UID: \"6fa0d50b-d592-4264-aedc-a407f94125be\") " pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.245911 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.396297 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.397244 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" podUID="bef63c9c-1759-461f-be02-bc8ee3a1f548" containerName="dnsmasq-dns" containerID="cri-o://abc5a19c3dcf61db9f3102ed6073c528d502e922cd722c627c43c308a37007f5" gracePeriod=10 Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.704486 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8zqdw"] Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.774565 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.846445 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bef63c9c-1759-461f-be02-bc8ee3a1f548-config\") pod \"bef63c9c-1759-461f-be02-bc8ee3a1f548\" (UID: \"bef63c9c-1759-461f-be02-bc8ee3a1f548\") " Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.846487 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bef63c9c-1759-461f-be02-bc8ee3a1f548-dns-svc\") pod \"bef63c9c-1759-461f-be02-bc8ee3a1f548\" (UID: \"bef63c9c-1759-461f-be02-bc8ee3a1f548\") " Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.846569 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82s5n\" (UniqueName: \"kubernetes.io/projected/bef63c9c-1759-461f-be02-bc8ee3a1f548-kube-api-access-82s5n\") pod \"bef63c9c-1759-461f-be02-bc8ee3a1f548\" (UID: \"bef63c9c-1759-461f-be02-bc8ee3a1f548\") 
" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.846608 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bef63c9c-1759-461f-be02-bc8ee3a1f548-ovsdbserver-nb\") pod \"bef63c9c-1759-461f-be02-bc8ee3a1f548\" (UID: \"bef63c9c-1759-461f-be02-bc8ee3a1f548\") " Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.852515 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bef63c9c-1759-461f-be02-bc8ee3a1f548-kube-api-access-82s5n" (OuterVolumeSpecName: "kube-api-access-82s5n") pod "bef63c9c-1759-461f-be02-bc8ee3a1f548" (UID: "bef63c9c-1759-461f-be02-bc8ee3a1f548"). InnerVolumeSpecName "kube-api-access-82s5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.886780 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bef63c9c-1759-461f-be02-bc8ee3a1f548-config" (OuterVolumeSpecName: "config") pod "bef63c9c-1759-461f-be02-bc8ee3a1f548" (UID: "bef63c9c-1759-461f-be02-bc8ee3a1f548"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.888943 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bef63c9c-1759-461f-be02-bc8ee3a1f548-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bef63c9c-1759-461f-be02-bc8ee3a1f548" (UID: "bef63c9c-1759-461f-be02-bc8ee3a1f548"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.889803 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.890558 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bef63c9c-1759-461f-be02-bc8ee3a1f548-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bef63c9c-1759-461f-be02-bc8ee3a1f548" (UID: "bef63c9c-1759-461f-be02-bc8ee3a1f548"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.926530 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.949119 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bef63c9c-1759-461f-be02-bc8ee3a1f548-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.949156 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bef63c9c-1759-461f-be02-bc8ee3a1f548-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.949169 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82s5n\" (UniqueName: \"kubernetes.io/projected/bef63c9c-1759-461f-be02-bc8ee3a1f548-kube-api-access-82s5n\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:42 crc kubenswrapper[4740]: I1009 10:43:42.949184 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bef63c9c-1759-461f-be02-bc8ee3a1f548-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:43 crc kubenswrapper[4740]: I1009 10:43:43.408917 4740 generic.go:334] "Generic (PLEG): 
container finished" podID="bef63c9c-1759-461f-be02-bc8ee3a1f548" containerID="abc5a19c3dcf61db9f3102ed6073c528d502e922cd722c627c43c308a37007f5" exitCode=0 Oct 09 10:43:43 crc kubenswrapper[4740]: I1009 10:43:43.408982 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" Oct 09 10:43:43 crc kubenswrapper[4740]: I1009 10:43:43.409024 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" event={"ID":"bef63c9c-1759-461f-be02-bc8ee3a1f548","Type":"ContainerDied","Data":"abc5a19c3dcf61db9f3102ed6073c528d502e922cd722c627c43c308a37007f5"} Oct 09 10:43:43 crc kubenswrapper[4740]: I1009 10:43:43.409063 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-hvkb9" event={"ID":"bef63c9c-1759-461f-be02-bc8ee3a1f548","Type":"ContainerDied","Data":"4c8370f86cdf0dc242888b09c8c74396977b4a078c70950fb2070c4f39645efb"} Oct 09 10:43:43 crc kubenswrapper[4740]: I1009 10:43:43.409089 4740 scope.go:117] "RemoveContainer" containerID="abc5a19c3dcf61db9f3102ed6073c528d502e922cd722c627c43c308a37007f5" Oct 09 10:43:43 crc kubenswrapper[4740]: I1009 10:43:43.412389 4740 generic.go:334] "Generic (PLEG): container finished" podID="6fa0d50b-d592-4264-aedc-a407f94125be" containerID="6da78e4f6dc644a4aeffa7d1e4f8cad6932046da17c0356610ef620262ff7fae" exitCode=0 Oct 09 10:43:43 crc kubenswrapper[4740]: I1009 10:43:43.412966 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" event={"ID":"6fa0d50b-d592-4264-aedc-a407f94125be","Type":"ContainerDied","Data":"6da78e4f6dc644a4aeffa7d1e4f8cad6932046da17c0356610ef620262ff7fae"} Oct 09 10:43:43 crc kubenswrapper[4740]: I1009 10:43:43.414504 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" 
event={"ID":"6fa0d50b-d592-4264-aedc-a407f94125be","Type":"ContainerStarted","Data":"136e9464f4e5ed2bc63ecbd6cdc98ac2ffde6f449f624af2e4b60d472afcebe7"} Oct 09 10:43:43 crc kubenswrapper[4740]: I1009 10:43:43.454314 4740 scope.go:117] "RemoveContainer" containerID="6ca0a4ed9e69d6c5430a27d821e7f1c817d13fc853e72ea977b11a83f2818363" Oct 09 10:43:43 crc kubenswrapper[4740]: I1009 10:43:43.482025 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 09 10:43:43 crc kubenswrapper[4740]: I1009 10:43:43.541959 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-hvkb9"] Oct 09 10:43:43 crc kubenswrapper[4740]: I1009 10:43:43.550262 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-hvkb9"] Oct 09 10:43:43 crc kubenswrapper[4740]: I1009 10:43:43.563632 4740 scope.go:117] "RemoveContainer" containerID="abc5a19c3dcf61db9f3102ed6073c528d502e922cd722c627c43c308a37007f5" Oct 09 10:43:43 crc kubenswrapper[4740]: E1009 10:43:43.564975 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abc5a19c3dcf61db9f3102ed6073c528d502e922cd722c627c43c308a37007f5\": container with ID starting with abc5a19c3dcf61db9f3102ed6073c528d502e922cd722c627c43c308a37007f5 not found: ID does not exist" containerID="abc5a19c3dcf61db9f3102ed6073c528d502e922cd722c627c43c308a37007f5" Oct 09 10:43:43 crc kubenswrapper[4740]: I1009 10:43:43.565120 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abc5a19c3dcf61db9f3102ed6073c528d502e922cd722c627c43c308a37007f5"} err="failed to get container status \"abc5a19c3dcf61db9f3102ed6073c528d502e922cd722c627c43c308a37007f5\": rpc error: code = NotFound desc = could not find container \"abc5a19c3dcf61db9f3102ed6073c528d502e922cd722c627c43c308a37007f5\": container with ID starting with 
abc5a19c3dcf61db9f3102ed6073c528d502e922cd722c627c43c308a37007f5 not found: ID does not exist" Oct 09 10:43:43 crc kubenswrapper[4740]: I1009 10:43:43.565152 4740 scope.go:117] "RemoveContainer" containerID="6ca0a4ed9e69d6c5430a27d821e7f1c817d13fc853e72ea977b11a83f2818363" Oct 09 10:43:43 crc kubenswrapper[4740]: E1009 10:43:43.565464 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ca0a4ed9e69d6c5430a27d821e7f1c817d13fc853e72ea977b11a83f2818363\": container with ID starting with 6ca0a4ed9e69d6c5430a27d821e7f1c817d13fc853e72ea977b11a83f2818363 not found: ID does not exist" containerID="6ca0a4ed9e69d6c5430a27d821e7f1c817d13fc853e72ea977b11a83f2818363" Oct 09 10:43:43 crc kubenswrapper[4740]: I1009 10:43:43.565488 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ca0a4ed9e69d6c5430a27d821e7f1c817d13fc853e72ea977b11a83f2818363"} err="failed to get container status \"6ca0a4ed9e69d6c5430a27d821e7f1c817d13fc853e72ea977b11a83f2818363\": rpc error: code = NotFound desc = could not find container \"6ca0a4ed9e69d6c5430a27d821e7f1c817d13fc853e72ea977b11a83f2818363\": container with ID starting with 6ca0a4ed9e69d6c5430a27d821e7f1c817d13fc853e72ea977b11a83f2818363 not found: ID does not exist" Oct 09 10:43:43 crc kubenswrapper[4740]: I1009 10:43:43.765453 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bef63c9c-1759-461f-be02-bc8ee3a1f548" path="/var/lib/kubelet/pods/bef63c9c-1759-461f-be02-bc8ee3a1f548/volumes" Oct 09 10:43:43 crc kubenswrapper[4740]: I1009 10:43:43.912452 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:43 crc kubenswrapper[4740]: I1009 10:43:43.979651 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:44 crc kubenswrapper[4740]: I1009 10:43:44.426529 4740 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" event={"ID":"6fa0d50b-d592-4264-aedc-a407f94125be","Type":"ContainerStarted","Data":"c5a8859f52e10bf94b4006029713a6d0e270865ed8e2a0791cf9b98672a20063"} Oct 09 10:43:44 crc kubenswrapper[4740]: I1009 10:43:44.426936 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" Oct 09 10:43:44 crc kubenswrapper[4740]: I1009 10:43:44.429598 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:44 crc kubenswrapper[4740]: I1009 10:43:44.447365 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" podStartSLOduration=3.447334826 podStartE2EDuration="3.447334826s" podCreationTimestamp="2025-10-09 10:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:43:44.444742815 +0000 UTC m=+963.406943256" watchObservedRunningTime="2025-10-09 10:43:44.447334826 +0000 UTC m=+963.409535247" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.402797 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.403001 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.478092 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.522984 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.523041 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.611825 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 09 10:43:45 crc kubenswrapper[4740]: E1009 10:43:45.612253 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef63c9c-1759-461f-be02-bc8ee3a1f548" containerName="init" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.612273 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef63c9c-1759-461f-be02-bc8ee3a1f548" containerName="init" Oct 09 10:43:45 crc kubenswrapper[4740]: E1009 10:43:45.612294 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef63c9c-1759-461f-be02-bc8ee3a1f548" containerName="dnsmasq-dns" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.612303 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef63c9c-1759-461f-be02-bc8ee3a1f548" containerName="dnsmasq-dns" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.612637 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef63c9c-1759-461f-be02-bc8ee3a1f548" containerName="dnsmasq-dns" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.619185 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.622385 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.622715 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.622917 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-skmjx" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.623114 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.634478 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.708799 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eda487bd-e994-4fce-86f9-50e85aaf30b2-config\") pod \"ovn-northd-0\" (UID: \"eda487bd-e994-4fce-86f9-50e85aaf30b2\") " pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.708851 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eda487bd-e994-4fce-86f9-50e85aaf30b2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"eda487bd-e994-4fce-86f9-50e85aaf30b2\") " pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.708991 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eda487bd-e994-4fce-86f9-50e85aaf30b2-scripts\") pod \"ovn-northd-0\" (UID: \"eda487bd-e994-4fce-86f9-50e85aaf30b2\") " pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc 
kubenswrapper[4740]: I1009 10:43:45.709040 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/eda487bd-e994-4fce-86f9-50e85aaf30b2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"eda487bd-e994-4fce-86f9-50e85aaf30b2\") " pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.709168 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda487bd-e994-4fce-86f9-50e85aaf30b2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"eda487bd-e994-4fce-86f9-50e85aaf30b2\") " pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.709252 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eda487bd-e994-4fce-86f9-50e85aaf30b2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"eda487bd-e994-4fce-86f9-50e85aaf30b2\") " pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.709407 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftdg7\" (UniqueName: \"kubernetes.io/projected/eda487bd-e994-4fce-86f9-50e85aaf30b2-kube-api-access-ftdg7\") pod \"ovn-northd-0\" (UID: \"eda487bd-e994-4fce-86f9-50e85aaf30b2\") " pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.810812 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eda487bd-e994-4fce-86f9-50e85aaf30b2-scripts\") pod \"ovn-northd-0\" (UID: \"eda487bd-e994-4fce-86f9-50e85aaf30b2\") " pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.810882 4740 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/eda487bd-e994-4fce-86f9-50e85aaf30b2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"eda487bd-e994-4fce-86f9-50e85aaf30b2\") " pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.810935 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda487bd-e994-4fce-86f9-50e85aaf30b2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"eda487bd-e994-4fce-86f9-50e85aaf30b2\") " pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.810973 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eda487bd-e994-4fce-86f9-50e85aaf30b2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"eda487bd-e994-4fce-86f9-50e85aaf30b2\") " pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.811018 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftdg7\" (UniqueName: \"kubernetes.io/projected/eda487bd-e994-4fce-86f9-50e85aaf30b2-kube-api-access-ftdg7\") pod \"ovn-northd-0\" (UID: \"eda487bd-e994-4fce-86f9-50e85aaf30b2\") " pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.811053 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eda487bd-e994-4fce-86f9-50e85aaf30b2-config\") pod \"ovn-northd-0\" (UID: \"eda487bd-e994-4fce-86f9-50e85aaf30b2\") " pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.811078 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eda487bd-e994-4fce-86f9-50e85aaf30b2-ovn-rundir\") pod \"ovn-northd-0\" (UID: 
\"eda487bd-e994-4fce-86f9-50e85aaf30b2\") " pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.811508 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eda487bd-e994-4fce-86f9-50e85aaf30b2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"eda487bd-e994-4fce-86f9-50e85aaf30b2\") " pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.811936 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eda487bd-e994-4fce-86f9-50e85aaf30b2-scripts\") pod \"ovn-northd-0\" (UID: \"eda487bd-e994-4fce-86f9-50e85aaf30b2\") " pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.812317 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eda487bd-e994-4fce-86f9-50e85aaf30b2-config\") pod \"ovn-northd-0\" (UID: \"eda487bd-e994-4fce-86f9-50e85aaf30b2\") " pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.817348 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eda487bd-e994-4fce-86f9-50e85aaf30b2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"eda487bd-e994-4fce-86f9-50e85aaf30b2\") " pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.820899 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda487bd-e994-4fce-86f9-50e85aaf30b2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"eda487bd-e994-4fce-86f9-50e85aaf30b2\") " pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.823634 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/eda487bd-e994-4fce-86f9-50e85aaf30b2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"eda487bd-e994-4fce-86f9-50e85aaf30b2\") " pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.832801 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftdg7\" (UniqueName: \"kubernetes.io/projected/eda487bd-e994-4fce-86f9-50e85aaf30b2-kube-api-access-ftdg7\") pod \"ovn-northd-0\" (UID: \"eda487bd-e994-4fce-86f9-50e85aaf30b2\") " pod="openstack/ovn-northd-0" Oct 09 10:43:45 crc kubenswrapper[4740]: I1009 10:43:45.940945 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 09 10:43:46 crc kubenswrapper[4740]: I1009 10:43:46.374274 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 09 10:43:46 crc kubenswrapper[4740]: W1009 10:43:46.382943 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeda487bd_e994_4fce_86f9_50e85aaf30b2.slice/crio-6df66febc35ab5e6032dd9286ea64a91479b52592fb388b36b825678b7e5ade9 WatchSource:0}: Error finding container 6df66febc35ab5e6032dd9286ea64a91479b52592fb388b36b825678b7e5ade9: Status 404 returned error can't find the container with id 6df66febc35ab5e6032dd9286ea64a91479b52592fb388b36b825678b7e5ade9 Oct 09 10:43:46 crc kubenswrapper[4740]: I1009 10:43:46.447046 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"eda487bd-e994-4fce-86f9-50e85aaf30b2","Type":"ContainerStarted","Data":"6df66febc35ab5e6032dd9286ea64a91479b52592fb388b36b825678b7e5ade9"} Oct 09 10:43:47 crc kubenswrapper[4740]: I1009 10:43:47.580556 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:47 crc kubenswrapper[4740]: I1009 10:43:47.629811 4740 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.232720 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8zqdw"] Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.233220 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" podUID="6fa0d50b-d592-4264-aedc-a407f94125be" containerName="dnsmasq-dns" containerID="cri-o://c5a8859f52e10bf94b4006029713a6d0e270865ed8e2a0791cf9b98672a20063" gracePeriod=10 Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.235906 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.300257 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-xfmk7"] Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.301490 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-xfmk7" Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.339313 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-xfmk7"] Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.454308 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-config\") pod \"dnsmasq-dns-698758b865-xfmk7\" (UID: \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\") " pod="openstack/dnsmasq-dns-698758b865-xfmk7" Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.454367 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-xfmk7\" (UID: \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\") " pod="openstack/dnsmasq-dns-698758b865-xfmk7" Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.454460 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-dns-svc\") pod \"dnsmasq-dns-698758b865-xfmk7\" (UID: \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\") " pod="openstack/dnsmasq-dns-698758b865-xfmk7" Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.454605 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-xfmk7\" (UID: \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\") " pod="openstack/dnsmasq-dns-698758b865-xfmk7" Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.454637 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-phwgn\" (UniqueName: \"kubernetes.io/projected/15e8a2fd-eacd-4108-af4c-355e0e923d2d-kube-api-access-phwgn\") pod \"dnsmasq-dns-698758b865-xfmk7\" (UID: \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\") " pod="openstack/dnsmasq-dns-698758b865-xfmk7" Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.462489 4740 generic.go:334] "Generic (PLEG): container finished" podID="6fa0d50b-d592-4264-aedc-a407f94125be" containerID="c5a8859f52e10bf94b4006029713a6d0e270865ed8e2a0791cf9b98672a20063" exitCode=0 Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.462559 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" event={"ID":"6fa0d50b-d592-4264-aedc-a407f94125be","Type":"ContainerDied","Data":"c5a8859f52e10bf94b4006029713a6d0e270865ed8e2a0791cf9b98672a20063"} Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.463788 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"eda487bd-e994-4fce-86f9-50e85aaf30b2","Type":"ContainerStarted","Data":"071636f5012d47fee7d033d2c6f24b9c551cd5bf539cc82f59c8f4a418455fa9"} Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.463842 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"eda487bd-e994-4fce-86f9-50e85aaf30b2","Type":"ContainerStarted","Data":"ac4a0b8628e685ad3bc8756352986f984c14d6d60ca00ca3ed03ca2da6f1260d"} Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.463893 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.481944 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.446213564 podStartE2EDuration="3.481924878s" podCreationTimestamp="2025-10-09 10:43:45 +0000 UTC" firstStartedPulling="2025-10-09 10:43:46.38508284 +0000 UTC m=+965.347283221" lastFinishedPulling="2025-10-09 10:43:47.420794154 
+0000 UTC m=+966.382994535" observedRunningTime="2025-10-09 10:43:48.481822276 +0000 UTC m=+967.444022657" watchObservedRunningTime="2025-10-09 10:43:48.481924878 +0000 UTC m=+967.444125259" Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.556254 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-xfmk7\" (UID: \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\") " pod="openstack/dnsmasq-dns-698758b865-xfmk7" Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.556301 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phwgn\" (UniqueName: \"kubernetes.io/projected/15e8a2fd-eacd-4108-af4c-355e0e923d2d-kube-api-access-phwgn\") pod \"dnsmasq-dns-698758b865-xfmk7\" (UID: \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\") " pod="openstack/dnsmasq-dns-698758b865-xfmk7" Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.556400 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-config\") pod \"dnsmasq-dns-698758b865-xfmk7\" (UID: \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\") " pod="openstack/dnsmasq-dns-698758b865-xfmk7" Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.556436 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-xfmk7\" (UID: \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\") " pod="openstack/dnsmasq-dns-698758b865-xfmk7" Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.556523 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-dns-svc\") pod 
\"dnsmasq-dns-698758b865-xfmk7\" (UID: \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\") " pod="openstack/dnsmasq-dns-698758b865-xfmk7" Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.558393 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-xfmk7\" (UID: \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\") " pod="openstack/dnsmasq-dns-698758b865-xfmk7" Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.560861 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-config\") pod \"dnsmasq-dns-698758b865-xfmk7\" (UID: \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\") " pod="openstack/dnsmasq-dns-698758b865-xfmk7" Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.560894 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-xfmk7\" (UID: \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\") " pod="openstack/dnsmasq-dns-698758b865-xfmk7" Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.560937 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-dns-svc\") pod \"dnsmasq-dns-698758b865-xfmk7\" (UID: \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\") " pod="openstack/dnsmasq-dns-698758b865-xfmk7" Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.574259 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phwgn\" (UniqueName: \"kubernetes.io/projected/15e8a2fd-eacd-4108-af4c-355e0e923d2d-kube-api-access-phwgn\") pod \"dnsmasq-dns-698758b865-xfmk7\" (UID: \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\") " 
pod="openstack/dnsmasq-dns-698758b865-xfmk7" Oct 09 10:43:48 crc kubenswrapper[4740]: I1009 10:43:48.617907 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-xfmk7" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.055624 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-xfmk7"] Oct 09 10:43:49 crc kubenswrapper[4740]: W1009 10:43:49.073243 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15e8a2fd_eacd_4108_af4c_355e0e923d2d.slice/crio-88e458d6b3d38aaaf2693c65b40764b2b55e4e6427796ba90a0583017aeeb32e WatchSource:0}: Error finding container 88e458d6b3d38aaaf2693c65b40764b2b55e4e6427796ba90a0583017aeeb32e: Status 404 returned error can't find the container with id 88e458d6b3d38aaaf2693c65b40764b2b55e4e6427796ba90a0583017aeeb32e Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.168050 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.265563 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gmsl\" (UniqueName: \"kubernetes.io/projected/6fa0d50b-d592-4264-aedc-a407f94125be-kube-api-access-8gmsl\") pod \"6fa0d50b-d592-4264-aedc-a407f94125be\" (UID: \"6fa0d50b-d592-4264-aedc-a407f94125be\") " Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.265615 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-config\") pod \"6fa0d50b-d592-4264-aedc-a407f94125be\" (UID: \"6fa0d50b-d592-4264-aedc-a407f94125be\") " Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.265695 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-ovsdbserver-nb\") pod \"6fa0d50b-d592-4264-aedc-a407f94125be\" (UID: \"6fa0d50b-d592-4264-aedc-a407f94125be\") " Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.265797 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-ovsdbserver-sb\") pod \"6fa0d50b-d592-4264-aedc-a407f94125be\" (UID: \"6fa0d50b-d592-4264-aedc-a407f94125be\") " Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.265862 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-dns-svc\") pod \"6fa0d50b-d592-4264-aedc-a407f94125be\" (UID: \"6fa0d50b-d592-4264-aedc-a407f94125be\") " Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.271375 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6fa0d50b-d592-4264-aedc-a407f94125be-kube-api-access-8gmsl" (OuterVolumeSpecName: "kube-api-access-8gmsl") pod "6fa0d50b-d592-4264-aedc-a407f94125be" (UID: "6fa0d50b-d592-4264-aedc-a407f94125be"). InnerVolumeSpecName "kube-api-access-8gmsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.309663 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-config" (OuterVolumeSpecName: "config") pod "6fa0d50b-d592-4264-aedc-a407f94125be" (UID: "6fa0d50b-d592-4264-aedc-a407f94125be"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.311530 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6fa0d50b-d592-4264-aedc-a407f94125be" (UID: "6fa0d50b-d592-4264-aedc-a407f94125be"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.320283 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6fa0d50b-d592-4264-aedc-a407f94125be" (UID: "6fa0d50b-d592-4264-aedc-a407f94125be"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.324235 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6fa0d50b-d592-4264-aedc-a407f94125be" (UID: "6fa0d50b-d592-4264-aedc-a407f94125be"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.367830 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.367860 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.367869 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.367879 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gmsl\" (UniqueName: \"kubernetes.io/projected/6fa0d50b-d592-4264-aedc-a407f94125be-kube-api-access-8gmsl\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.367890 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fa0d50b-d592-4264-aedc-a407f94125be-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.462348 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 09 10:43:49 crc kubenswrapper[4740]: E1009 10:43:49.463206 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fa0d50b-d592-4264-aedc-a407f94125be" containerName="init" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.463239 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa0d50b-d592-4264-aedc-a407f94125be" containerName="init" Oct 09 10:43:49 crc kubenswrapper[4740]: E1009 10:43:49.463272 4740 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="6fa0d50b-d592-4264-aedc-a407f94125be" containerName="dnsmasq-dns" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.463284 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa0d50b-d592-4264-aedc-a407f94125be" containerName="dnsmasq-dns" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.463548 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fa0d50b-d592-4264-aedc-a407f94125be" containerName="dnsmasq-dns" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.474173 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.475786 4740 generic.go:334] "Generic (PLEG): container finished" podID="15e8a2fd-eacd-4108-af4c-355e0e923d2d" containerID="e37245c82cc0511a51cab427ce153f87ab6df9e19d64d3b62fc3e1e7e99c9914" exitCode=0 Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.476084 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.479073 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-xfmk7" event={"ID":"15e8a2fd-eacd-4108-af4c-355e0e923d2d","Type":"ContainerDied","Data":"e37245c82cc0511a51cab427ce153f87ab6df9e19d64d3b62fc3e1e7e99c9914"} Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.485301 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-xfmk7" event={"ID":"15e8a2fd-eacd-4108-af4c-355e0e923d2d","Type":"ContainerStarted","Data":"88e458d6b3d38aaaf2693c65b40764b2b55e4e6427796ba90a0583017aeeb32e"} Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.476177 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-4k2tg" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.478967 4740 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.479044 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.493556 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.495323 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8zqdw" event={"ID":"6fa0d50b-d592-4264-aedc-a407f94125be","Type":"ContainerDied","Data":"136e9464f4e5ed2bc63ecbd6cdc98ac2ffde6f449f624af2e4b60d472afcebe7"} Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.495381 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.495408 4740 scope.go:117] "RemoveContainer" containerID="c5a8859f52e10bf94b4006029713a6d0e270865ed8e2a0791cf9b98672a20063" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.498469 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.569544 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.575272 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/73a11218-32c1-4b40-a738-f56e795904d7-lock\") pod \"swift-storage-0\" (UID: \"73a11218-32c1-4b40-a738-f56e795904d7\") " pod="openstack/swift-storage-0" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.575385 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/73a11218-32c1-4b40-a738-f56e795904d7-etc-swift\") pod \"swift-storage-0\" (UID: \"73a11218-32c1-4b40-a738-f56e795904d7\") " pod="openstack/swift-storage-0" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.575673 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/73a11218-32c1-4b40-a738-f56e795904d7-cache\") pod \"swift-storage-0\" (UID: \"73a11218-32c1-4b40-a738-f56e795904d7\") " pod="openstack/swift-storage-0" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.575738 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddd4m\" (UniqueName: \"kubernetes.io/projected/73a11218-32c1-4b40-a738-f56e795904d7-kube-api-access-ddd4m\") pod \"swift-storage-0\" (UID: \"73a11218-32c1-4b40-a738-f56e795904d7\") " pod="openstack/swift-storage-0" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.576906 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"73a11218-32c1-4b40-a738-f56e795904d7\") " pod="openstack/swift-storage-0" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.596948 4740 scope.go:117] "RemoveContainer" containerID="6da78e4f6dc644a4aeffa7d1e4f8cad6932046da17c0356610ef620262ff7fae" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.653678 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8zqdw"] Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.667745 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8zqdw"] Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.679909 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/73a11218-32c1-4b40-a738-f56e795904d7-cache\") pod \"swift-storage-0\" (UID: \"73a11218-32c1-4b40-a738-f56e795904d7\") " pod="openstack/swift-storage-0" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.679967 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddd4m\" (UniqueName: \"kubernetes.io/projected/73a11218-32c1-4b40-a738-f56e795904d7-kube-api-access-ddd4m\") pod \"swift-storage-0\" (UID: \"73a11218-32c1-4b40-a738-f56e795904d7\") " pod="openstack/swift-storage-0" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.680027 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"73a11218-32c1-4b40-a738-f56e795904d7\") " pod="openstack/swift-storage-0" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.680080 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/73a11218-32c1-4b40-a738-f56e795904d7-lock\") pod \"swift-storage-0\" (UID: \"73a11218-32c1-4b40-a738-f56e795904d7\") " pod="openstack/swift-storage-0" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.680103 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73a11218-32c1-4b40-a738-f56e795904d7-etc-swift\") pod \"swift-storage-0\" (UID: \"73a11218-32c1-4b40-a738-f56e795904d7\") " pod="openstack/swift-storage-0" Oct 09 10:43:49 crc kubenswrapper[4740]: E1009 10:43:49.680215 4740 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 10:43:49 crc kubenswrapper[4740]: E1009 10:43:49.680231 4740 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 10:43:49 crc 
kubenswrapper[4740]: E1009 10:43:49.680272 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73a11218-32c1-4b40-a738-f56e795904d7-etc-swift podName:73a11218-32c1-4b40-a738-f56e795904d7 nodeName:}" failed. No retries permitted until 2025-10-09 10:43:50.180258186 +0000 UTC m=+969.142458567 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/73a11218-32c1-4b40-a738-f56e795904d7-etc-swift") pod "swift-storage-0" (UID: "73a11218-32c1-4b40-a738-f56e795904d7") : configmap "swift-ring-files" not found Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.680516 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"73a11218-32c1-4b40-a738-f56e795904d7\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.680844 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/73a11218-32c1-4b40-a738-f56e795904d7-lock\") pod \"swift-storage-0\" (UID: \"73a11218-32c1-4b40-a738-f56e795904d7\") " pod="openstack/swift-storage-0" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.681109 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/73a11218-32c1-4b40-a738-f56e795904d7-cache\") pod \"swift-storage-0\" (UID: \"73a11218-32c1-4b40-a738-f56e795904d7\") " pod="openstack/swift-storage-0" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.699144 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddd4m\" (UniqueName: \"kubernetes.io/projected/73a11218-32c1-4b40-a738-f56e795904d7-kube-api-access-ddd4m\") pod \"swift-storage-0\" (UID: \"73a11218-32c1-4b40-a738-f56e795904d7\") " 
pod="openstack/swift-storage-0" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.701291 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"73a11218-32c1-4b40-a738-f56e795904d7\") " pod="openstack/swift-storage-0" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.764071 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fa0d50b-d592-4264-aedc-a407f94125be" path="/var/lib/kubelet/pods/6fa0d50b-d592-4264-aedc-a407f94125be/volumes" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.951813 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-vzk5q"] Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.953408 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.955189 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.955255 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.955447 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.976458 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vzk5q"] Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.986773 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-combined-ca-bundle\") pod \"swift-ring-rebalance-vzk5q\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " 
pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.986879 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-ring-data-devices\") pod \"swift-ring-rebalance-vzk5q\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.986902 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-dispersionconf\") pod \"swift-ring-rebalance-vzk5q\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.986931 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-swiftconf\") pod \"swift-ring-rebalance-vzk5q\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.986960 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rrc9\" (UniqueName: \"kubernetes.io/projected/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-kube-api-access-7rrc9\") pod \"swift-ring-rebalance-vzk5q\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.986983 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-scripts\") pod \"swift-ring-rebalance-vzk5q\" (UID: 
\"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:49 crc kubenswrapper[4740]: I1009 10:43:49.987021 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-etc-swift\") pod \"swift-ring-rebalance-vzk5q\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:50 crc kubenswrapper[4740]: I1009 10:43:50.088607 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-swiftconf\") pod \"swift-ring-rebalance-vzk5q\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:50 crc kubenswrapper[4740]: I1009 10:43:50.088680 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rrc9\" (UniqueName: \"kubernetes.io/projected/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-kube-api-access-7rrc9\") pod \"swift-ring-rebalance-vzk5q\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:50 crc kubenswrapper[4740]: I1009 10:43:50.088715 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-scripts\") pod \"swift-ring-rebalance-vzk5q\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:50 crc kubenswrapper[4740]: I1009 10:43:50.088796 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-etc-swift\") pod \"swift-ring-rebalance-vzk5q\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " pod="openstack/swift-ring-rebalance-vzk5q" 
Oct 09 10:43:50 crc kubenswrapper[4740]: I1009 10:43:50.088819 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-combined-ca-bundle\") pod \"swift-ring-rebalance-vzk5q\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:50 crc kubenswrapper[4740]: I1009 10:43:50.088920 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-ring-data-devices\") pod \"swift-ring-rebalance-vzk5q\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:50 crc kubenswrapper[4740]: I1009 10:43:50.088959 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-dispersionconf\") pod \"swift-ring-rebalance-vzk5q\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:50 crc kubenswrapper[4740]: I1009 10:43:50.089657 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-scripts\") pod \"swift-ring-rebalance-vzk5q\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:50 crc kubenswrapper[4740]: I1009 10:43:50.089873 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-etc-swift\") pod \"swift-ring-rebalance-vzk5q\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:50 crc kubenswrapper[4740]: I1009 10:43:50.090242 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-ring-data-devices\") pod \"swift-ring-rebalance-vzk5q\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:50 crc kubenswrapper[4740]: I1009 10:43:50.093803 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-dispersionconf\") pod \"swift-ring-rebalance-vzk5q\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:50 crc kubenswrapper[4740]: I1009 10:43:50.094232 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-combined-ca-bundle\") pod \"swift-ring-rebalance-vzk5q\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:50 crc kubenswrapper[4740]: I1009 10:43:50.094792 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-swiftconf\") pod \"swift-ring-rebalance-vzk5q\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:50 crc kubenswrapper[4740]: I1009 10:43:50.108063 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rrc9\" (UniqueName: \"kubernetes.io/projected/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-kube-api-access-7rrc9\") pod \"swift-ring-rebalance-vzk5q\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:50 crc kubenswrapper[4740]: I1009 10:43:50.191146 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/73a11218-32c1-4b40-a738-f56e795904d7-etc-swift\") pod \"swift-storage-0\" (UID: \"73a11218-32c1-4b40-a738-f56e795904d7\") " pod="openstack/swift-storage-0" Oct 09 10:43:50 crc kubenswrapper[4740]: E1009 10:43:50.191304 4740 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 10:43:50 crc kubenswrapper[4740]: E1009 10:43:50.191333 4740 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 10:43:50 crc kubenswrapper[4740]: E1009 10:43:50.191390 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73a11218-32c1-4b40-a738-f56e795904d7-etc-swift podName:73a11218-32c1-4b40-a738-f56e795904d7 nodeName:}" failed. No retries permitted until 2025-10-09 10:43:51.191373703 +0000 UTC m=+970.153574084 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/73a11218-32c1-4b40-a738-f56e795904d7-etc-swift") pod "swift-storage-0" (UID: "73a11218-32c1-4b40-a738-f56e795904d7") : configmap "swift-ring-files" not found Oct 09 10:43:50 crc kubenswrapper[4740]: I1009 10:43:50.273309 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:43:50 crc kubenswrapper[4740]: I1009 10:43:50.502100 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-xfmk7" event={"ID":"15e8a2fd-eacd-4108-af4c-355e0e923d2d","Type":"ContainerStarted","Data":"5fe8a375a64307f65032de6fa7d5f83e4375c39d9d4146fd082b66c7f2f5f874"} Oct 09 10:43:50 crc kubenswrapper[4740]: I1009 10:43:50.502531 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-xfmk7" Oct 09 10:43:50 crc kubenswrapper[4740]: I1009 10:43:50.529246 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-xfmk7" podStartSLOduration=2.529225915 podStartE2EDuration="2.529225915s" podCreationTimestamp="2025-10-09 10:43:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:43:50.520544986 +0000 UTC m=+969.482745367" watchObservedRunningTime="2025-10-09 10:43:50.529225915 +0000 UTC m=+969.491426316" Oct 09 10:43:50 crc kubenswrapper[4740]: I1009 10:43:50.725198 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vzk5q"] Oct 09 10:43:50 crc kubenswrapper[4740]: W1009 10:43:50.738778 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebeb8396_40be_4400_8a2f_d1cdeb8c20e4.slice/crio-25f95111bb686d446bcfc3908ed8ed5eadc27b9defce972bd444475d226ada2c WatchSource:0}: Error finding container 25f95111bb686d446bcfc3908ed8ed5eadc27b9defce972bd444475d226ada2c: Status 404 returned error can't find the container with id 25f95111bb686d446bcfc3908ed8ed5eadc27b9defce972bd444475d226ada2c Oct 09 10:43:51 crc kubenswrapper[4740]: I1009 10:43:51.171904 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-77xsq"] Oct 09 
10:43:51 crc kubenswrapper[4740]: I1009 10:43:51.173219 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-77xsq" Oct 09 10:43:51 crc kubenswrapper[4740]: I1009 10:43:51.185913 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-77xsq"] Oct 09 10:43:51 crc kubenswrapper[4740]: I1009 10:43:51.207868 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gtsd\" (UniqueName: \"kubernetes.io/projected/cb042060-39cf-4d73-be53-aa20360e48f1-kube-api-access-6gtsd\") pod \"glance-db-create-77xsq\" (UID: \"cb042060-39cf-4d73-be53-aa20360e48f1\") " pod="openstack/glance-db-create-77xsq" Oct 09 10:43:51 crc kubenswrapper[4740]: I1009 10:43:51.208078 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73a11218-32c1-4b40-a738-f56e795904d7-etc-swift\") pod \"swift-storage-0\" (UID: \"73a11218-32c1-4b40-a738-f56e795904d7\") " pod="openstack/swift-storage-0" Oct 09 10:43:51 crc kubenswrapper[4740]: E1009 10:43:51.208273 4740 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 10:43:51 crc kubenswrapper[4740]: E1009 10:43:51.208311 4740 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 10:43:51 crc kubenswrapper[4740]: E1009 10:43:51.208377 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73a11218-32c1-4b40-a738-f56e795904d7-etc-swift podName:73a11218-32c1-4b40-a738-f56e795904d7 nodeName:}" failed. No retries permitted until 2025-10-09 10:43:53.208356772 +0000 UTC m=+972.170557173 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/73a11218-32c1-4b40-a738-f56e795904d7-etc-swift") pod "swift-storage-0" (UID: "73a11218-32c1-4b40-a738-f56e795904d7") : configmap "swift-ring-files" not found Oct 09 10:43:51 crc kubenswrapper[4740]: I1009 10:43:51.309950 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gtsd\" (UniqueName: \"kubernetes.io/projected/cb042060-39cf-4d73-be53-aa20360e48f1-kube-api-access-6gtsd\") pod \"glance-db-create-77xsq\" (UID: \"cb042060-39cf-4d73-be53-aa20360e48f1\") " pod="openstack/glance-db-create-77xsq" Oct 09 10:43:51 crc kubenswrapper[4740]: I1009 10:43:51.336490 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gtsd\" (UniqueName: \"kubernetes.io/projected/cb042060-39cf-4d73-be53-aa20360e48f1-kube-api-access-6gtsd\") pod \"glance-db-create-77xsq\" (UID: \"cb042060-39cf-4d73-be53-aa20360e48f1\") " pod="openstack/glance-db-create-77xsq" Oct 09 10:43:51 crc kubenswrapper[4740]: I1009 10:43:51.491692 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-77xsq" Oct 09 10:43:51 crc kubenswrapper[4740]: I1009 10:43:51.510551 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vzk5q" event={"ID":"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4","Type":"ContainerStarted","Data":"25f95111bb686d446bcfc3908ed8ed5eadc27b9defce972bd444475d226ada2c"} Oct 09 10:43:51 crc kubenswrapper[4740]: I1009 10:43:51.957429 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-77xsq"] Oct 09 10:43:51 crc kubenswrapper[4740]: W1009 10:43:51.975072 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb042060_39cf_4d73_be53_aa20360e48f1.slice/crio-d5035690515a0a23c3c06c304020823c1c0d26303e29134a3024a1aad0c8e4a3 WatchSource:0}: Error finding container d5035690515a0a23c3c06c304020823c1c0d26303e29134a3024a1aad0c8e4a3: Status 404 returned error can't find the container with id d5035690515a0a23c3c06c304020823c1c0d26303e29134a3024a1aad0c8e4a3 Oct 09 10:43:52 crc kubenswrapper[4740]: I1009 10:43:52.521405 4740 generic.go:334] "Generic (PLEG): container finished" podID="cb042060-39cf-4d73-be53-aa20360e48f1" containerID="250a88eff5719440e2f3afca5e0f8af2fb1e6743e20cec7c527db6d28f462c40" exitCode=0 Oct 09 10:43:52 crc kubenswrapper[4740]: I1009 10:43:52.521456 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-77xsq" event={"ID":"cb042060-39cf-4d73-be53-aa20360e48f1","Type":"ContainerDied","Data":"250a88eff5719440e2f3afca5e0f8af2fb1e6743e20cec7c527db6d28f462c40"} Oct 09 10:43:52 crc kubenswrapper[4740]: I1009 10:43:52.521483 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-77xsq" event={"ID":"cb042060-39cf-4d73-be53-aa20360e48f1","Type":"ContainerStarted","Data":"d5035690515a0a23c3c06c304020823c1c0d26303e29134a3024a1aad0c8e4a3"} Oct 09 10:43:53 crc kubenswrapper[4740]: I1009 
10:43:53.248467 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73a11218-32c1-4b40-a738-f56e795904d7-etc-swift\") pod \"swift-storage-0\" (UID: \"73a11218-32c1-4b40-a738-f56e795904d7\") " pod="openstack/swift-storage-0" Oct 09 10:43:53 crc kubenswrapper[4740]: E1009 10:43:53.248688 4740 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 10:43:53 crc kubenswrapper[4740]: E1009 10:43:53.248926 4740 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 10:43:53 crc kubenswrapper[4740]: E1009 10:43:53.248979 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73a11218-32c1-4b40-a738-f56e795904d7-etc-swift podName:73a11218-32c1-4b40-a738-f56e795904d7 nodeName:}" failed. No retries permitted until 2025-10-09 10:43:57.248965395 +0000 UTC m=+976.211165776 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/73a11218-32c1-4b40-a738-f56e795904d7-etc-swift") pod "swift-storage-0" (UID: "73a11218-32c1-4b40-a738-f56e795904d7") : configmap "swift-ring-files" not found Oct 09 10:43:54 crc kubenswrapper[4740]: I1009 10:43:54.462558 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-77xsq" Oct 09 10:43:54 crc kubenswrapper[4740]: I1009 10:43:54.541872 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-77xsq" event={"ID":"cb042060-39cf-4d73-be53-aa20360e48f1","Type":"ContainerDied","Data":"d5035690515a0a23c3c06c304020823c1c0d26303e29134a3024a1aad0c8e4a3"} Oct 09 10:43:54 crc kubenswrapper[4740]: I1009 10:43:54.542079 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5035690515a0a23c3c06c304020823c1c0d26303e29134a3024a1aad0c8e4a3" Oct 09 10:43:54 crc kubenswrapper[4740]: I1009 10:43:54.541933 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-77xsq" Oct 09 10:43:54 crc kubenswrapper[4740]: I1009 10:43:54.566545 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gtsd\" (UniqueName: \"kubernetes.io/projected/cb042060-39cf-4d73-be53-aa20360e48f1-kube-api-access-6gtsd\") pod \"cb042060-39cf-4d73-be53-aa20360e48f1\" (UID: \"cb042060-39cf-4d73-be53-aa20360e48f1\") " Oct 09 10:43:54 crc kubenswrapper[4740]: I1009 10:43:54.575318 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb042060-39cf-4d73-be53-aa20360e48f1-kube-api-access-6gtsd" (OuterVolumeSpecName: "kube-api-access-6gtsd") pod "cb042060-39cf-4d73-be53-aa20360e48f1" (UID: "cb042060-39cf-4d73-be53-aa20360e48f1"). InnerVolumeSpecName "kube-api-access-6gtsd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:43:54 crc kubenswrapper[4740]: I1009 10:43:54.669162 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gtsd\" (UniqueName: \"kubernetes.io/projected/cb042060-39cf-4d73-be53-aa20360e48f1-kube-api-access-6gtsd\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:55 crc kubenswrapper[4740]: I1009 10:43:55.557918 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-clpxk"] Oct 09 10:43:55 crc kubenswrapper[4740]: E1009 10:43:55.559797 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb042060-39cf-4d73-be53-aa20360e48f1" containerName="mariadb-database-create" Oct 09 10:43:55 crc kubenswrapper[4740]: I1009 10:43:55.559825 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb042060-39cf-4d73-be53-aa20360e48f1" containerName="mariadb-database-create" Oct 09 10:43:55 crc kubenswrapper[4740]: I1009 10:43:55.560171 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb042060-39cf-4d73-be53-aa20360e48f1" containerName="mariadb-database-create" Oct 09 10:43:55 crc kubenswrapper[4740]: I1009 10:43:55.562123 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vzk5q" event={"ID":"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4","Type":"ContainerStarted","Data":"ec0607f8d70577c54468813a5b04688c690975029b158d7708ead400c7591c31"} Oct 09 10:43:55 crc kubenswrapper[4740]: I1009 10:43:55.562780 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-clpxk" Oct 09 10:43:55 crc kubenswrapper[4740]: I1009 10:43:55.565602 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-clpxk"] Oct 09 10:43:55 crc kubenswrapper[4740]: I1009 10:43:55.583169 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-vzk5q" podStartSLOduration=2.972625461 podStartE2EDuration="6.58315484s" podCreationTimestamp="2025-10-09 10:43:49 +0000 UTC" firstStartedPulling="2025-10-09 10:43:50.744915556 +0000 UTC m=+969.707115937" lastFinishedPulling="2025-10-09 10:43:54.355444925 +0000 UTC m=+973.317645316" observedRunningTime="2025-10-09 10:43:55.58062916 +0000 UTC m=+974.542829581" watchObservedRunningTime="2025-10-09 10:43:55.58315484 +0000 UTC m=+974.545355221" Oct 09 10:43:55 crc kubenswrapper[4740]: I1009 10:43:55.685994 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp7lb\" (UniqueName: \"kubernetes.io/projected/34de7ce0-3d21-4ea0-9cbb-377ae365f423-kube-api-access-fp7lb\") pod \"keystone-db-create-clpxk\" (UID: \"34de7ce0-3d21-4ea0-9cbb-377ae365f423\") " pod="openstack/keystone-db-create-clpxk" Oct 09 10:43:55 crc kubenswrapper[4740]: I1009 10:43:55.787631 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp7lb\" (UniqueName: \"kubernetes.io/projected/34de7ce0-3d21-4ea0-9cbb-377ae365f423-kube-api-access-fp7lb\") pod \"keystone-db-create-clpxk\" (UID: \"34de7ce0-3d21-4ea0-9cbb-377ae365f423\") " pod="openstack/keystone-db-create-clpxk" Oct 09 10:43:55 crc kubenswrapper[4740]: I1009 10:43:55.804058 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp7lb\" (UniqueName: \"kubernetes.io/projected/34de7ce0-3d21-4ea0-9cbb-377ae365f423-kube-api-access-fp7lb\") pod \"keystone-db-create-clpxk\" (UID: \"34de7ce0-3d21-4ea0-9cbb-377ae365f423\") " 
pod="openstack/keystone-db-create-clpxk" Oct 09 10:43:55 crc kubenswrapper[4740]: I1009 10:43:55.872347 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-f7twr"] Oct 09 10:43:55 crc kubenswrapper[4740]: I1009 10:43:55.876573 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-f7twr" Oct 09 10:43:55 crc kubenswrapper[4740]: I1009 10:43:55.881933 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-f7twr"] Oct 09 10:43:55 crc kubenswrapper[4740]: I1009 10:43:55.887065 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-clpxk" Oct 09 10:43:55 crc kubenswrapper[4740]: I1009 10:43:55.890401 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw89j\" (UniqueName: \"kubernetes.io/projected/50c526e9-512f-4541-9c40-dbe246d4afa9-kube-api-access-vw89j\") pod \"placement-db-create-f7twr\" (UID: \"50c526e9-512f-4541-9c40-dbe246d4afa9\") " pod="openstack/placement-db-create-f7twr" Oct 09 10:43:55 crc kubenswrapper[4740]: I1009 10:43:55.994973 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw89j\" (UniqueName: \"kubernetes.io/projected/50c526e9-512f-4541-9c40-dbe246d4afa9-kube-api-access-vw89j\") pod \"placement-db-create-f7twr\" (UID: \"50c526e9-512f-4541-9c40-dbe246d4afa9\") " pod="openstack/placement-db-create-f7twr" Oct 09 10:43:56 crc kubenswrapper[4740]: I1009 10:43:56.017323 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw89j\" (UniqueName: \"kubernetes.io/projected/50c526e9-512f-4541-9c40-dbe246d4afa9-kube-api-access-vw89j\") pod \"placement-db-create-f7twr\" (UID: \"50c526e9-512f-4541-9c40-dbe246d4afa9\") " pod="openstack/placement-db-create-f7twr" Oct 09 10:43:56 crc kubenswrapper[4740]: I1009 10:43:56.207240 4740 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-f7twr" Oct 09 10:43:56 crc kubenswrapper[4740]: I1009 10:43:56.372008 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-clpxk"] Oct 09 10:43:56 crc kubenswrapper[4740]: I1009 10:43:56.574180 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-clpxk" event={"ID":"34de7ce0-3d21-4ea0-9cbb-377ae365f423","Type":"ContainerStarted","Data":"17c61b6deb7c03c82d02128991d6f3d6e679b8c28d2c7fea29077b59210c0d8f"} Oct 09 10:43:56 crc kubenswrapper[4740]: I1009 10:43:56.574224 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-clpxk" event={"ID":"34de7ce0-3d21-4ea0-9cbb-377ae365f423","Type":"ContainerStarted","Data":"3af7797b9923a0bd64eaec135c3457e6247b1c538d40e8bc9c53cdf3b4fbe2cc"} Oct 09 10:43:56 crc kubenswrapper[4740]: I1009 10:43:56.597958 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-clpxk" podStartSLOduration=1.5979384890000001 podStartE2EDuration="1.597938489s" podCreationTimestamp="2025-10-09 10:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:43:56.589441395 +0000 UTC m=+975.551641776" watchObservedRunningTime="2025-10-09 10:43:56.597938489 +0000 UTC m=+975.560138880" Oct 09 10:43:56 crc kubenswrapper[4740]: I1009 10:43:56.676196 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-f7twr"] Oct 09 10:43:56 crc kubenswrapper[4740]: W1009 10:43:56.700587 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50c526e9_512f_4541_9c40_dbe246d4afa9.slice/crio-62451f65ad88b4bc57b9d36d86afb51a02f805245a3af06d626bb30bb89958a0 WatchSource:0}: Error finding container 
62451f65ad88b4bc57b9d36d86afb51a02f805245a3af06d626bb30bb89958a0: Status 404 returned error can't find the container with id 62451f65ad88b4bc57b9d36d86afb51a02f805245a3af06d626bb30bb89958a0 Oct 09 10:43:57 crc kubenswrapper[4740]: I1009 10:43:57.320515 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73a11218-32c1-4b40-a738-f56e795904d7-etc-swift\") pod \"swift-storage-0\" (UID: \"73a11218-32c1-4b40-a738-f56e795904d7\") " pod="openstack/swift-storage-0" Oct 09 10:43:57 crc kubenswrapper[4740]: E1009 10:43:57.320820 4740 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 10:43:57 crc kubenswrapper[4740]: E1009 10:43:57.321033 4740 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 10:43:57 crc kubenswrapper[4740]: E1009 10:43:57.321112 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73a11218-32c1-4b40-a738-f56e795904d7-etc-swift podName:73a11218-32c1-4b40-a738-f56e795904d7 nodeName:}" failed. No retries permitted until 2025-10-09 10:44:05.321087548 +0000 UTC m=+984.283287929 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/73a11218-32c1-4b40-a738-f56e795904d7-etc-swift") pod "swift-storage-0" (UID: "73a11218-32c1-4b40-a738-f56e795904d7") : configmap "swift-ring-files" not found Oct 09 10:43:57 crc kubenswrapper[4740]: I1009 10:43:57.583100 4740 generic.go:334] "Generic (PLEG): container finished" podID="34de7ce0-3d21-4ea0-9cbb-377ae365f423" containerID="17c61b6deb7c03c82d02128991d6f3d6e679b8c28d2c7fea29077b59210c0d8f" exitCode=0 Oct 09 10:43:57 crc kubenswrapper[4740]: I1009 10:43:57.583192 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-clpxk" event={"ID":"34de7ce0-3d21-4ea0-9cbb-377ae365f423","Type":"ContainerDied","Data":"17c61b6deb7c03c82d02128991d6f3d6e679b8c28d2c7fea29077b59210c0d8f"} Oct 09 10:43:57 crc kubenswrapper[4740]: I1009 10:43:57.587414 4740 generic.go:334] "Generic (PLEG): container finished" podID="50c526e9-512f-4541-9c40-dbe246d4afa9" containerID="4bc58709a36d78e199b1fee4e2c77ebd1fe522c0a69fa6c7acfb3d8f5694a177" exitCode=0 Oct 09 10:43:57 crc kubenswrapper[4740]: I1009 10:43:57.587466 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-f7twr" event={"ID":"50c526e9-512f-4541-9c40-dbe246d4afa9","Type":"ContainerDied","Data":"4bc58709a36d78e199b1fee4e2c77ebd1fe522c0a69fa6c7acfb3d8f5694a177"} Oct 09 10:43:57 crc kubenswrapper[4740]: I1009 10:43:57.587659 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-f7twr" event={"ID":"50c526e9-512f-4541-9c40-dbe246d4afa9","Type":"ContainerStarted","Data":"62451f65ad88b4bc57b9d36d86afb51a02f805245a3af06d626bb30bb89958a0"} Oct 09 10:43:58 crc kubenswrapper[4740]: I1009 10:43:58.621321 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-xfmk7" Oct 09 10:43:58 crc kubenswrapper[4740]: I1009 10:43:58.704935 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-57d769cc4f-xdk5b"] Oct 09 10:43:58 crc kubenswrapper[4740]: I1009 10:43:58.705150 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" podUID="18483b6e-9941-4d1c-af6e-2812758e0265" containerName="dnsmasq-dns" containerID="cri-o://f9eb71e9fec1337e96de8f69121a038a4155d477d6413e3172a8f15cab535500" gracePeriod=10 Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.099296 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-clpxk" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.123967 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-f7twr" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.202649 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.272218 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw89j\" (UniqueName: \"kubernetes.io/projected/50c526e9-512f-4541-9c40-dbe246d4afa9-kube-api-access-vw89j\") pod \"50c526e9-512f-4541-9c40-dbe246d4afa9\" (UID: \"50c526e9-512f-4541-9c40-dbe246d4afa9\") " Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.272609 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp7lb\" (UniqueName: \"kubernetes.io/projected/34de7ce0-3d21-4ea0-9cbb-377ae365f423-kube-api-access-fp7lb\") pod \"34de7ce0-3d21-4ea0-9cbb-377ae365f423\" (UID: \"34de7ce0-3d21-4ea0-9cbb-377ae365f423\") " Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.278466 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34de7ce0-3d21-4ea0-9cbb-377ae365f423-kube-api-access-fp7lb" (OuterVolumeSpecName: "kube-api-access-fp7lb") pod 
"34de7ce0-3d21-4ea0-9cbb-377ae365f423" (UID: "34de7ce0-3d21-4ea0-9cbb-377ae365f423"). InnerVolumeSpecName "kube-api-access-fp7lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.278725 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50c526e9-512f-4541-9c40-dbe246d4afa9-kube-api-access-vw89j" (OuterVolumeSpecName: "kube-api-access-vw89j") pod "50c526e9-512f-4541-9c40-dbe246d4afa9" (UID: "50c526e9-512f-4541-9c40-dbe246d4afa9"). InnerVolumeSpecName "kube-api-access-vw89j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.373944 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18483b6e-9941-4d1c-af6e-2812758e0265-dns-svc\") pod \"18483b6e-9941-4d1c-af6e-2812758e0265\" (UID: \"18483b6e-9941-4d1c-af6e-2812758e0265\") " Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.374020 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xdrv\" (UniqueName: \"kubernetes.io/projected/18483b6e-9941-4d1c-af6e-2812758e0265-kube-api-access-6xdrv\") pod \"18483b6e-9941-4d1c-af6e-2812758e0265\" (UID: \"18483b6e-9941-4d1c-af6e-2812758e0265\") " Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.374120 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18483b6e-9941-4d1c-af6e-2812758e0265-config\") pod \"18483b6e-9941-4d1c-af6e-2812758e0265\" (UID: \"18483b6e-9941-4d1c-af6e-2812758e0265\") " Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.374469 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp7lb\" (UniqueName: \"kubernetes.io/projected/34de7ce0-3d21-4ea0-9cbb-377ae365f423-kube-api-access-fp7lb\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:59 crc 
kubenswrapper[4740]: I1009 10:43:59.374486 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw89j\" (UniqueName: \"kubernetes.io/projected/50c526e9-512f-4541-9c40-dbe246d4afa9-kube-api-access-vw89j\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.376957 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18483b6e-9941-4d1c-af6e-2812758e0265-kube-api-access-6xdrv" (OuterVolumeSpecName: "kube-api-access-6xdrv") pod "18483b6e-9941-4d1c-af6e-2812758e0265" (UID: "18483b6e-9941-4d1c-af6e-2812758e0265"). InnerVolumeSpecName "kube-api-access-6xdrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.408000 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18483b6e-9941-4d1c-af6e-2812758e0265-config" (OuterVolumeSpecName: "config") pod "18483b6e-9941-4d1c-af6e-2812758e0265" (UID: "18483b6e-9941-4d1c-af6e-2812758e0265"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.414712 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18483b6e-9941-4d1c-af6e-2812758e0265-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "18483b6e-9941-4d1c-af6e-2812758e0265" (UID: "18483b6e-9941-4d1c-af6e-2812758e0265"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.475968 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18483b6e-9941-4d1c-af6e-2812758e0265-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.476001 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18483b6e-9941-4d1c-af6e-2812758e0265-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.476014 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xdrv\" (UniqueName: \"kubernetes.io/projected/18483b6e-9941-4d1c-af6e-2812758e0265-kube-api-access-6xdrv\") on node \"crc\" DevicePath \"\"" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.608960 4740 generic.go:334] "Generic (PLEG): container finished" podID="18483b6e-9941-4d1c-af6e-2812758e0265" containerID="f9eb71e9fec1337e96de8f69121a038a4155d477d6413e3172a8f15cab535500" exitCode=0 Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.609026 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" event={"ID":"18483b6e-9941-4d1c-af6e-2812758e0265","Type":"ContainerDied","Data":"f9eb71e9fec1337e96de8f69121a038a4155d477d6413e3172a8f15cab535500"} Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.609032 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.609077 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xdk5b" event={"ID":"18483b6e-9941-4d1c-af6e-2812758e0265","Type":"ContainerDied","Data":"8609c5b0f024f710a56eab9e6b6ac024a38facdb11427f3932e5761144cc8a7b"} Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.609100 4740 scope.go:117] "RemoveContainer" containerID="f9eb71e9fec1337e96de8f69121a038a4155d477d6413e3172a8f15cab535500" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.610495 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-clpxk" event={"ID":"34de7ce0-3d21-4ea0-9cbb-377ae365f423","Type":"ContainerDied","Data":"3af7797b9923a0bd64eaec135c3457e6247b1c538d40e8bc9c53cdf3b4fbe2cc"} Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.610512 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-clpxk" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.610525 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3af7797b9923a0bd64eaec135c3457e6247b1c538d40e8bc9c53cdf3b4fbe2cc" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.612496 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-f7twr" event={"ID":"50c526e9-512f-4541-9c40-dbe246d4afa9","Type":"ContainerDied","Data":"62451f65ad88b4bc57b9d36d86afb51a02f805245a3af06d626bb30bb89958a0"} Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.612528 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62451f65ad88b4bc57b9d36d86afb51a02f805245a3af06d626bb30bb89958a0" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.612528 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-f7twr" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.638390 4740 scope.go:117] "RemoveContainer" containerID="eb166ac4b8be2c92d6a7502fa4c7bb6669b9624366d412ee7f6aae9c916e0d13" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.655398 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xdk5b"] Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.660230 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xdk5b"] Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.664065 4740 scope.go:117] "RemoveContainer" containerID="f9eb71e9fec1337e96de8f69121a038a4155d477d6413e3172a8f15cab535500" Oct 09 10:43:59 crc kubenswrapper[4740]: E1009 10:43:59.664460 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9eb71e9fec1337e96de8f69121a038a4155d477d6413e3172a8f15cab535500\": container with ID starting with f9eb71e9fec1337e96de8f69121a038a4155d477d6413e3172a8f15cab535500 not found: ID does not exist" containerID="f9eb71e9fec1337e96de8f69121a038a4155d477d6413e3172a8f15cab535500" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.664490 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9eb71e9fec1337e96de8f69121a038a4155d477d6413e3172a8f15cab535500"} err="failed to get container status \"f9eb71e9fec1337e96de8f69121a038a4155d477d6413e3172a8f15cab535500\": rpc error: code = NotFound desc = could not find container \"f9eb71e9fec1337e96de8f69121a038a4155d477d6413e3172a8f15cab535500\": container with ID starting with f9eb71e9fec1337e96de8f69121a038a4155d477d6413e3172a8f15cab535500 not found: ID does not exist" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.664509 4740 scope.go:117] "RemoveContainer" containerID="eb166ac4b8be2c92d6a7502fa4c7bb6669b9624366d412ee7f6aae9c916e0d13" Oct 09 
10:43:59 crc kubenswrapper[4740]: E1009 10:43:59.664792 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb166ac4b8be2c92d6a7502fa4c7bb6669b9624366d412ee7f6aae9c916e0d13\": container with ID starting with eb166ac4b8be2c92d6a7502fa4c7bb6669b9624366d412ee7f6aae9c916e0d13 not found: ID does not exist" containerID="eb166ac4b8be2c92d6a7502fa4c7bb6669b9624366d412ee7f6aae9c916e0d13" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.664815 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb166ac4b8be2c92d6a7502fa4c7bb6669b9624366d412ee7f6aae9c916e0d13"} err="failed to get container status \"eb166ac4b8be2c92d6a7502fa4c7bb6669b9624366d412ee7f6aae9c916e0d13\": rpc error: code = NotFound desc = could not find container \"eb166ac4b8be2c92d6a7502fa4c7bb6669b9624366d412ee7f6aae9c916e0d13\": container with ID starting with eb166ac4b8be2c92d6a7502fa4c7bb6669b9624366d412ee7f6aae9c916e0d13 not found: ID does not exist" Oct 09 10:43:59 crc kubenswrapper[4740]: I1009 10:43:59.765019 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18483b6e-9941-4d1c-af6e-2812758e0265" path="/var/lib/kubelet/pods/18483b6e-9941-4d1c-af6e-2812758e0265/volumes" Oct 09 10:44:01 crc kubenswrapper[4740]: I1009 10:44:01.025914 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 09 10:44:01 crc kubenswrapper[4740]: I1009 10:44:01.275995 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-fc7c-account-create-7hr86"] Oct 09 10:44:01 crc kubenswrapper[4740]: E1009 10:44:01.276401 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34de7ce0-3d21-4ea0-9cbb-377ae365f423" containerName="mariadb-database-create" Oct 09 10:44:01 crc kubenswrapper[4740]: I1009 10:44:01.276426 4740 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="34de7ce0-3d21-4ea0-9cbb-377ae365f423" containerName="mariadb-database-create" Oct 09 10:44:01 crc kubenswrapper[4740]: E1009 10:44:01.276450 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50c526e9-512f-4541-9c40-dbe246d4afa9" containerName="mariadb-database-create" Oct 09 10:44:01 crc kubenswrapper[4740]: I1009 10:44:01.276458 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="50c526e9-512f-4541-9c40-dbe246d4afa9" containerName="mariadb-database-create" Oct 09 10:44:01 crc kubenswrapper[4740]: E1009 10:44:01.276475 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18483b6e-9941-4d1c-af6e-2812758e0265" containerName="init" Oct 09 10:44:01 crc kubenswrapper[4740]: I1009 10:44:01.276484 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="18483b6e-9941-4d1c-af6e-2812758e0265" containerName="init" Oct 09 10:44:01 crc kubenswrapper[4740]: E1009 10:44:01.276502 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18483b6e-9941-4d1c-af6e-2812758e0265" containerName="dnsmasq-dns" Oct 09 10:44:01 crc kubenswrapper[4740]: I1009 10:44:01.276510 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="18483b6e-9941-4d1c-af6e-2812758e0265" containerName="dnsmasq-dns" Oct 09 10:44:01 crc kubenswrapper[4740]: I1009 10:44:01.276696 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="50c526e9-512f-4541-9c40-dbe246d4afa9" containerName="mariadb-database-create" Oct 09 10:44:01 crc kubenswrapper[4740]: I1009 10:44:01.276723 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="34de7ce0-3d21-4ea0-9cbb-377ae365f423" containerName="mariadb-database-create" Oct 09 10:44:01 crc kubenswrapper[4740]: I1009 10:44:01.276735 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="18483b6e-9941-4d1c-af6e-2812758e0265" containerName="dnsmasq-dns" Oct 09 10:44:01 crc kubenswrapper[4740]: I1009 10:44:01.277384 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-fc7c-account-create-7hr86" Oct 09 10:44:01 crc kubenswrapper[4740]: I1009 10:44:01.281376 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 09 10:44:01 crc kubenswrapper[4740]: I1009 10:44:01.285059 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-fc7c-account-create-7hr86"] Oct 09 10:44:01 crc kubenswrapper[4740]: I1009 10:44:01.406044 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wplj\" (UniqueName: \"kubernetes.io/projected/1c063f94-2867-463a-a5e7-a436d38ebb1d-kube-api-access-4wplj\") pod \"glance-fc7c-account-create-7hr86\" (UID: \"1c063f94-2867-463a-a5e7-a436d38ebb1d\") " pod="openstack/glance-fc7c-account-create-7hr86" Oct 09 10:44:01 crc kubenswrapper[4740]: I1009 10:44:01.508280 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wplj\" (UniqueName: \"kubernetes.io/projected/1c063f94-2867-463a-a5e7-a436d38ebb1d-kube-api-access-4wplj\") pod \"glance-fc7c-account-create-7hr86\" (UID: \"1c063f94-2867-463a-a5e7-a436d38ebb1d\") " pod="openstack/glance-fc7c-account-create-7hr86" Oct 09 10:44:01 crc kubenswrapper[4740]: I1009 10:44:01.532448 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wplj\" (UniqueName: \"kubernetes.io/projected/1c063f94-2867-463a-a5e7-a436d38ebb1d-kube-api-access-4wplj\") pod \"glance-fc7c-account-create-7hr86\" (UID: \"1c063f94-2867-463a-a5e7-a436d38ebb1d\") " pod="openstack/glance-fc7c-account-create-7hr86" Oct 09 10:44:01 crc kubenswrapper[4740]: I1009 10:44:01.595225 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-fc7c-account-create-7hr86" Oct 09 10:44:02 crc kubenswrapper[4740]: I1009 10:44:02.131985 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-fc7c-account-create-7hr86"] Oct 09 10:44:02 crc kubenswrapper[4740]: I1009 10:44:02.644609 4740 generic.go:334] "Generic (PLEG): container finished" podID="ebeb8396-40be-4400-8a2f-d1cdeb8c20e4" containerID="ec0607f8d70577c54468813a5b04688c690975029b158d7708ead400c7591c31" exitCode=0 Oct 09 10:44:02 crc kubenswrapper[4740]: I1009 10:44:02.644783 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vzk5q" event={"ID":"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4","Type":"ContainerDied","Data":"ec0607f8d70577c54468813a5b04688c690975029b158d7708ead400c7591c31"} Oct 09 10:44:02 crc kubenswrapper[4740]: I1009 10:44:02.647697 4740 generic.go:334] "Generic (PLEG): container finished" podID="1c063f94-2867-463a-a5e7-a436d38ebb1d" containerID="68c121cb1b9b5d61067946f21aaa382efbedb856bac38ee5f18dbfdb60ffc68a" exitCode=0 Oct 09 10:44:02 crc kubenswrapper[4740]: I1009 10:44:02.647805 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fc7c-account-create-7hr86" event={"ID":"1c063f94-2867-463a-a5e7-a436d38ebb1d","Type":"ContainerDied","Data":"68c121cb1b9b5d61067946f21aaa382efbedb856bac38ee5f18dbfdb60ffc68a"} Oct 09 10:44:02 crc kubenswrapper[4740]: I1009 10:44:02.647866 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fc7c-account-create-7hr86" event={"ID":"1c063f94-2867-463a-a5e7-a436d38ebb1d","Type":"ContainerStarted","Data":"59a9259e1ec006fb1487948a8e125b3da1849226c98926c1a1daa85e4a4fbe06"} Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.083720 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-fc7c-account-create-7hr86" Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.092886 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.258213 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-scripts\") pod \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.258276 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-combined-ca-bundle\") pod \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.258307 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wplj\" (UniqueName: \"kubernetes.io/projected/1c063f94-2867-463a-a5e7-a436d38ebb1d-kube-api-access-4wplj\") pod \"1c063f94-2867-463a-a5e7-a436d38ebb1d\" (UID: \"1c063f94-2867-463a-a5e7-a436d38ebb1d\") " Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.258343 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-swiftconf\") pod \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.258421 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rrc9\" (UniqueName: \"kubernetes.io/projected/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-kube-api-access-7rrc9\") pod \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\" (UID: 
\"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.258510 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-etc-swift\") pod \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.258563 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-dispersionconf\") pod \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.258586 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-ring-data-devices\") pod \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\" (UID: \"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4\") " Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.260279 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ebeb8396-40be-4400-8a2f-d1cdeb8c20e4" (UID: "ebeb8396-40be-4400-8a2f-d1cdeb8c20e4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.261606 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ebeb8396-40be-4400-8a2f-d1cdeb8c20e4" (UID: "ebeb8396-40be-4400-8a2f-d1cdeb8c20e4"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.267705 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c063f94-2867-463a-a5e7-a436d38ebb1d-kube-api-access-4wplj" (OuterVolumeSpecName: "kube-api-access-4wplj") pod "1c063f94-2867-463a-a5e7-a436d38ebb1d" (UID: "1c063f94-2867-463a-a5e7-a436d38ebb1d"). InnerVolumeSpecName "kube-api-access-4wplj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.268373 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-kube-api-access-7rrc9" (OuterVolumeSpecName: "kube-api-access-7rrc9") pod "ebeb8396-40be-4400-8a2f-d1cdeb8c20e4" (UID: "ebeb8396-40be-4400-8a2f-d1cdeb8c20e4"). InnerVolumeSpecName "kube-api-access-7rrc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.270041 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ebeb8396-40be-4400-8a2f-d1cdeb8c20e4" (UID: "ebeb8396-40be-4400-8a2f-d1cdeb8c20e4"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.285553 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ebeb8396-40be-4400-8a2f-d1cdeb8c20e4" (UID: "ebeb8396-40be-4400-8a2f-d1cdeb8c20e4"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.287716 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebeb8396-40be-4400-8a2f-d1cdeb8c20e4" (UID: "ebeb8396-40be-4400-8a2f-d1cdeb8c20e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.300250 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-scripts" (OuterVolumeSpecName: "scripts") pod "ebeb8396-40be-4400-8a2f-d1cdeb8c20e4" (UID: "ebeb8396-40be-4400-8a2f-d1cdeb8c20e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.361148 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rrc9\" (UniqueName: \"kubernetes.io/projected/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-kube-api-access-7rrc9\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.361203 4740 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.361229 4740 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.361253 4740 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-ring-data-devices\") on node \"crc\" DevicePath 
\"\"" Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.361276 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.361298 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.361325 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wplj\" (UniqueName: \"kubernetes.io/projected/1c063f94-2867-463a-a5e7-a436d38ebb1d-kube-api-access-4wplj\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.361346 4740 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ebeb8396-40be-4400-8a2f-d1cdeb8c20e4-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.667524 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fc7c-account-create-7hr86" event={"ID":"1c063f94-2867-463a-a5e7-a436d38ebb1d","Type":"ContainerDied","Data":"59a9259e1ec006fb1487948a8e125b3da1849226c98926c1a1daa85e4a4fbe06"} Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.667875 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59a9259e1ec006fb1487948a8e125b3da1849226c98926c1a1daa85e4a4fbe06" Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.667585 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-fc7c-account-create-7hr86" Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.669826 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vzk5q" event={"ID":"ebeb8396-40be-4400-8a2f-d1cdeb8c20e4","Type":"ContainerDied","Data":"25f95111bb686d446bcfc3908ed8ed5eadc27b9defce972bd444475d226ada2c"} Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.669878 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25f95111bb686d446bcfc3908ed8ed5eadc27b9defce972bd444475d226ada2c" Oct 09 10:44:04 crc kubenswrapper[4740]: I1009 10:44:04.670393 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vzk5q" Oct 09 10:44:05 crc kubenswrapper[4740]: I1009 10:44:05.376846 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73a11218-32c1-4b40-a738-f56e795904d7-etc-swift\") pod \"swift-storage-0\" (UID: \"73a11218-32c1-4b40-a738-f56e795904d7\") " pod="openstack/swift-storage-0" Oct 09 10:44:05 crc kubenswrapper[4740]: I1009 10:44:05.388483 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73a11218-32c1-4b40-a738-f56e795904d7-etc-swift\") pod \"swift-storage-0\" (UID: \"73a11218-32c1-4b40-a738-f56e795904d7\") " pod="openstack/swift-storage-0" Oct 09 10:44:05 crc kubenswrapper[4740]: I1009 10:44:05.587332 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 09 10:44:05 crc kubenswrapper[4740]: I1009 10:44:05.685044 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-0fdb-account-create-7jl66"] Oct 09 10:44:05 crc kubenswrapper[4740]: I1009 10:44:05.685339 4740 generic.go:334] "Generic (PLEG): container finished" podID="aa98dfc6-da2e-42b0-a620-a07230e1833d" containerID="cdd4feba6cd032d418bc8180dd1a1569db9bc194b9a8d185360898a2b39c3a5c" exitCode=0 Oct 09 10:44:05 crc kubenswrapper[4740]: E1009 10:44:05.685479 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c063f94-2867-463a-a5e7-a436d38ebb1d" containerName="mariadb-account-create" Oct 09 10:44:05 crc kubenswrapper[4740]: I1009 10:44:05.685506 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c063f94-2867-463a-a5e7-a436d38ebb1d" containerName="mariadb-account-create" Oct 09 10:44:05 crc kubenswrapper[4740]: E1009 10:44:05.685557 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebeb8396-40be-4400-8a2f-d1cdeb8c20e4" containerName="swift-ring-rebalance" Oct 09 10:44:05 crc kubenswrapper[4740]: I1009 10:44:05.685573 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebeb8396-40be-4400-8a2f-d1cdeb8c20e4" containerName="swift-ring-rebalance" Oct 09 10:44:05 crc kubenswrapper[4740]: I1009 10:44:05.685882 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c063f94-2867-463a-a5e7-a436d38ebb1d" containerName="mariadb-account-create" Oct 09 10:44:05 crc kubenswrapper[4740]: I1009 10:44:05.685911 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebeb8396-40be-4400-8a2f-d1cdeb8c20e4" containerName="swift-ring-rebalance" Oct 09 10:44:05 crc kubenswrapper[4740]: I1009 10:44:05.686651 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"aa98dfc6-da2e-42b0-a620-a07230e1833d","Type":"ContainerDied","Data":"cdd4feba6cd032d418bc8180dd1a1569db9bc194b9a8d185360898a2b39c3a5c"} Oct 09 10:44:05 crc kubenswrapper[4740]: I1009 10:44:05.686787 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0fdb-account-create-7jl66" Oct 09 10:44:05 crc kubenswrapper[4740]: I1009 10:44:05.689338 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 09 10:44:05 crc kubenswrapper[4740]: I1009 10:44:05.690535 4740 generic.go:334] "Generic (PLEG): container finished" podID="187134d2-2fe9-4beb-beff-6a48162a1933" containerID="bcd7f5081393f9b0fd83b07b79c9fd1569cb832e594a750001a732dca196c1c0" exitCode=0 Oct 09 10:44:05 crc kubenswrapper[4740]: I1009 10:44:05.690611 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"187134d2-2fe9-4beb-beff-6a48162a1933","Type":"ContainerDied","Data":"bcd7f5081393f9b0fd83b07b79c9fd1569cb832e594a750001a732dca196c1c0"} Oct 09 10:44:05 crc kubenswrapper[4740]: I1009 10:44:05.693623 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0fdb-account-create-7jl66"] Oct 09 10:44:05 crc kubenswrapper[4740]: I1009 10:44:05.885470 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kftkn\" (UniqueName: \"kubernetes.io/projected/82d7f29f-de12-4aed-95ee-47b80666098b-kube-api-access-kftkn\") pod \"keystone-0fdb-account-create-7jl66\" (UID: \"82d7f29f-de12-4aed-95ee-47b80666098b\") " pod="openstack/keystone-0fdb-account-create-7jl66" Oct 09 10:44:05 crc kubenswrapper[4740]: I1009 10:44:05.982196 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d4a2-account-create-kj8v8"] Oct 09 10:44:05 crc kubenswrapper[4740]: I1009 10:44:05.983881 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d4a2-account-create-kj8v8" Oct 09 10:44:05 crc kubenswrapper[4740]: I1009 10:44:05.987577 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kftkn\" (UniqueName: \"kubernetes.io/projected/82d7f29f-de12-4aed-95ee-47b80666098b-kube-api-access-kftkn\") pod \"keystone-0fdb-account-create-7jl66\" (UID: \"82d7f29f-de12-4aed-95ee-47b80666098b\") " pod="openstack/keystone-0fdb-account-create-7jl66" Oct 09 10:44:05 crc kubenswrapper[4740]: I1009 10:44:05.988257 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 09 10:44:05 crc kubenswrapper[4740]: I1009 10:44:05.991326 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d4a2-account-create-kj8v8"] Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.010255 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kftkn\" (UniqueName: \"kubernetes.io/projected/82d7f29f-de12-4aed-95ee-47b80666098b-kube-api-access-kftkn\") pod \"keystone-0fdb-account-create-7jl66\" (UID: \"82d7f29f-de12-4aed-95ee-47b80666098b\") " pod="openstack/keystone-0fdb-account-create-7jl66" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.088922 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnch4\" (UniqueName: \"kubernetes.io/projected/1de7a924-745d-40e0-8358-271bb7034f87-kube-api-access-tnch4\") pod \"placement-d4a2-account-create-kj8v8\" (UID: \"1de7a924-745d-40e0-8358-271bb7034f87\") " pod="openstack/placement-d4a2-account-create-kj8v8" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.107523 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0fdb-account-create-7jl66" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.155307 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.195260 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnch4\" (UniqueName: \"kubernetes.io/projected/1de7a924-745d-40e0-8358-271bb7034f87-kube-api-access-tnch4\") pod \"placement-d4a2-account-create-kj8v8\" (UID: \"1de7a924-745d-40e0-8358-271bb7034f87\") " pod="openstack/placement-d4a2-account-create-kj8v8" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.216670 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnch4\" (UniqueName: \"kubernetes.io/projected/1de7a924-745d-40e0-8358-271bb7034f87-kube-api-access-tnch4\") pod \"placement-d4a2-account-create-kj8v8\" (UID: \"1de7a924-745d-40e0-8358-271bb7034f87\") " pod="openstack/placement-d4a2-account-create-kj8v8" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.309919 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d4a2-account-create-kj8v8" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.319623 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-c6rld" podUID="7f56ff38-de3a-4c48-8fc0-43e0eac26c55" containerName="ovn-controller" probeResult="failure" output=< Oct 09 10:44:06 crc kubenswrapper[4740]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 09 10:44:06 crc kubenswrapper[4740]: > Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.435779 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-cpnq8"] Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.436692 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-cpnq8" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.442998 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.444461 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-gdptk" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.461926 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-cpnq8"] Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.523864 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0fdb-account-create-7jl66"] Oct 09 10:44:06 crc kubenswrapper[4740]: W1009 10:44:06.528228 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82d7f29f_de12_4aed_95ee_47b80666098b.slice/crio-75f2dd667febaca2fd2f9b8b3bda077efa7b3c53b26ccd8c9625d3f4dd9992b8 WatchSource:0}: Error finding container 75f2dd667febaca2fd2f9b8b3bda077efa7b3c53b26ccd8c9625d3f4dd9992b8: Status 404 returned error can't find the container with id 75f2dd667febaca2fd2f9b8b3bda077efa7b3c53b26ccd8c9625d3f4dd9992b8 Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.605700 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c004cb4-8052-425c-ac2e-11159a708cad-db-sync-config-data\") pod \"glance-db-sync-cpnq8\" (UID: \"9c004cb4-8052-425c-ac2e-11159a708cad\") " pod="openstack/glance-db-sync-cpnq8" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.606170 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mspnf\" (UniqueName: \"kubernetes.io/projected/9c004cb4-8052-425c-ac2e-11159a708cad-kube-api-access-mspnf\") pod \"glance-db-sync-cpnq8\" (UID: 
\"9c004cb4-8052-425c-ac2e-11159a708cad\") " pod="openstack/glance-db-sync-cpnq8" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.606241 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c004cb4-8052-425c-ac2e-11159a708cad-combined-ca-bundle\") pod \"glance-db-sync-cpnq8\" (UID: \"9c004cb4-8052-425c-ac2e-11159a708cad\") " pod="openstack/glance-db-sync-cpnq8" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.606294 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c004cb4-8052-425c-ac2e-11159a708cad-config-data\") pod \"glance-db-sync-cpnq8\" (UID: \"9c004cb4-8052-425c-ac2e-11159a708cad\") " pod="openstack/glance-db-sync-cpnq8" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.708162 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a11218-32c1-4b40-a738-f56e795904d7","Type":"ContainerStarted","Data":"5c2dd55392b290dbe7b0f68f9120e2360b2b6ea5513c4bdd896cce74f452db16"} Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.708648 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c004cb4-8052-425c-ac2e-11159a708cad-config-data\") pod \"glance-db-sync-cpnq8\" (UID: \"9c004cb4-8052-425c-ac2e-11159a708cad\") " pod="openstack/glance-db-sync-cpnq8" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.708724 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c004cb4-8052-425c-ac2e-11159a708cad-db-sync-config-data\") pod \"glance-db-sync-cpnq8\" (UID: \"9c004cb4-8052-425c-ac2e-11159a708cad\") " pod="openstack/glance-db-sync-cpnq8" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.708814 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mspnf\" (UniqueName: \"kubernetes.io/projected/9c004cb4-8052-425c-ac2e-11159a708cad-kube-api-access-mspnf\") pod \"glance-db-sync-cpnq8\" (UID: \"9c004cb4-8052-425c-ac2e-11159a708cad\") " pod="openstack/glance-db-sync-cpnq8" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.708867 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c004cb4-8052-425c-ac2e-11159a708cad-combined-ca-bundle\") pod \"glance-db-sync-cpnq8\" (UID: \"9c004cb4-8052-425c-ac2e-11159a708cad\") " pod="openstack/glance-db-sync-cpnq8" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.713285 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c004cb4-8052-425c-ac2e-11159a708cad-config-data\") pod \"glance-db-sync-cpnq8\" (UID: \"9c004cb4-8052-425c-ac2e-11159a708cad\") " pod="openstack/glance-db-sync-cpnq8" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.713384 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c004cb4-8052-425c-ac2e-11159a708cad-db-sync-config-data\") pod \"glance-db-sync-cpnq8\" (UID: \"9c004cb4-8052-425c-ac2e-11159a708cad\") " pod="openstack/glance-db-sync-cpnq8" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.713530 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c004cb4-8052-425c-ac2e-11159a708cad-combined-ca-bundle\") pod \"glance-db-sync-cpnq8\" (UID: \"9c004cb4-8052-425c-ac2e-11159a708cad\") " pod="openstack/glance-db-sync-cpnq8" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.713638 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0fdb-account-create-7jl66" 
event={"ID":"82d7f29f-de12-4aed-95ee-47b80666098b","Type":"ContainerStarted","Data":"75f2dd667febaca2fd2f9b8b3bda077efa7b3c53b26ccd8c9625d3f4dd9992b8"} Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.727445 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mspnf\" (UniqueName: \"kubernetes.io/projected/9c004cb4-8052-425c-ac2e-11159a708cad-kube-api-access-mspnf\") pod \"glance-db-sync-cpnq8\" (UID: \"9c004cb4-8052-425c-ac2e-11159a708cad\") " pod="openstack/glance-db-sync-cpnq8" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.771449 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-cpnq8" Oct 09 10:44:06 crc kubenswrapper[4740]: I1009 10:44:06.796320 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d4a2-account-create-kj8v8"] Oct 09 10:44:06 crc kubenswrapper[4740]: W1009 10:44:06.811812 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1de7a924_745d_40e0_8358_271bb7034f87.slice/crio-301f5fc83e878a2603ae36bf4ac2ec3b0a4de552752e5210798f9ba2e973e8d6 WatchSource:0}: Error finding container 301f5fc83e878a2603ae36bf4ac2ec3b0a4de552752e5210798f9ba2e973e8d6: Status 404 returned error can't find the container with id 301f5fc83e878a2603ae36bf4ac2ec3b0a4de552752e5210798f9ba2e973e8d6 Oct 09 10:44:07 crc kubenswrapper[4740]: I1009 10:44:07.288639 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-cpnq8"] Oct 09 10:44:07 crc kubenswrapper[4740]: I1009 10:44:07.728242 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"187134d2-2fe9-4beb-beff-6a48162a1933","Type":"ContainerStarted","Data":"94f0634b0ed255b557447062d5631f7ff62524a41768d6c0fafad907dce032a4"} Oct 09 10:44:07 crc kubenswrapper[4740]: I1009 10:44:07.728777 4740 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:44:07 crc kubenswrapper[4740]: I1009 10:44:07.733378 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"aa98dfc6-da2e-42b0-a620-a07230e1833d","Type":"ContainerStarted","Data":"01b99e23c64ae6330a8f471ea454aa9650f76d3f7505ba0da69597a8e2af2368"} Oct 09 10:44:07 crc kubenswrapper[4740]: I1009 10:44:07.733591 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 09 10:44:07 crc kubenswrapper[4740]: I1009 10:44:07.737013 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a11218-32c1-4b40-a738-f56e795904d7","Type":"ContainerStarted","Data":"1d1f3e8ca7519529aece52bd4e5bad12b742af817a17a6fce6c79b751bd30adf"} Oct 09 10:44:07 crc kubenswrapper[4740]: I1009 10:44:07.739114 4740 generic.go:334] "Generic (PLEG): container finished" podID="1de7a924-745d-40e0-8358-271bb7034f87" containerID="146ae6f0c7b06a3d9faf9183a4eb846527d25f121b313170f7b786d828402864" exitCode=0 Oct 09 10:44:07 crc kubenswrapper[4740]: I1009 10:44:07.739168 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d4a2-account-create-kj8v8" event={"ID":"1de7a924-745d-40e0-8358-271bb7034f87","Type":"ContainerDied","Data":"146ae6f0c7b06a3d9faf9183a4eb846527d25f121b313170f7b786d828402864"} Oct 09 10:44:07 crc kubenswrapper[4740]: I1009 10:44:07.739202 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d4a2-account-create-kj8v8" event={"ID":"1de7a924-745d-40e0-8358-271bb7034f87","Type":"ContainerStarted","Data":"301f5fc83e878a2603ae36bf4ac2ec3b0a4de552752e5210798f9ba2e973e8d6"} Oct 09 10:44:07 crc kubenswrapper[4740]: I1009 10:44:07.740621 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-cpnq8" 
event={"ID":"9c004cb4-8052-425c-ac2e-11159a708cad","Type":"ContainerStarted","Data":"400f178c24606656dc27d9d7d445061afb791034ad4fcd048757b0e77eceac9b"} Oct 09 10:44:07 crc kubenswrapper[4740]: I1009 10:44:07.743615 4740 generic.go:334] "Generic (PLEG): container finished" podID="82d7f29f-de12-4aed-95ee-47b80666098b" containerID="1c1cda55d07490473d5da1be62c95e7b64c81eb399c4c134772c5887bd60efb0" exitCode=0 Oct 09 10:44:07 crc kubenswrapper[4740]: I1009 10:44:07.743649 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0fdb-account-create-7jl66" event={"ID":"82d7f29f-de12-4aed-95ee-47b80666098b","Type":"ContainerDied","Data":"1c1cda55d07490473d5da1be62c95e7b64c81eb399c4c134772c5887bd60efb0"} Oct 09 10:44:07 crc kubenswrapper[4740]: I1009 10:44:07.769251 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=49.347730057 podStartE2EDuration="56.769231438s" podCreationTimestamp="2025-10-09 10:43:11 +0000 UTC" firstStartedPulling="2025-10-09 10:43:23.425431818 +0000 UTC m=+942.387632189" lastFinishedPulling="2025-10-09 10:43:30.846933179 +0000 UTC m=+949.809133570" observedRunningTime="2025-10-09 10:44:07.762363059 +0000 UTC m=+986.724563440" watchObservedRunningTime="2025-10-09 10:44:07.769231438 +0000 UTC m=+986.731431819" Oct 09 10:44:07 crc kubenswrapper[4740]: I1009 10:44:07.791106 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=50.217705561 podStartE2EDuration="57.791088509s" podCreationTimestamp="2025-10-09 10:43:10 +0000 UTC" firstStartedPulling="2025-10-09 10:43:23.022405404 +0000 UTC m=+941.984605785" lastFinishedPulling="2025-10-09 10:43:30.595788352 +0000 UTC m=+949.557988733" observedRunningTime="2025-10-09 10:44:07.785581538 +0000 UTC m=+986.747781939" watchObservedRunningTime="2025-10-09 10:44:07.791088509 +0000 UTC m=+986.753288890" Oct 09 10:44:08 crc kubenswrapper[4740]: 
I1009 10:44:08.753902 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a11218-32c1-4b40-a738-f56e795904d7","Type":"ContainerStarted","Data":"37473a7a1cc1da9d1578f3ae39801df2e610b8b23cf40dccd753af62d15fb6ee"} Oct 09 10:44:08 crc kubenswrapper[4740]: I1009 10:44:08.754210 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a11218-32c1-4b40-a738-f56e795904d7","Type":"ContainerStarted","Data":"5c1da34b9f06d4a750ddcfefd8fc50885046708482e0f580be17273386db3d04"} Oct 09 10:44:08 crc kubenswrapper[4740]: I1009 10:44:08.754222 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a11218-32c1-4b40-a738-f56e795904d7","Type":"ContainerStarted","Data":"8bf54082a9bef3098a456cc0b046431a73d6daff5aa9acd429fbbe952d00872a"} Oct 09 10:44:09 crc kubenswrapper[4740]: I1009 10:44:09.089946 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0fdb-account-create-7jl66" Oct 09 10:44:09 crc kubenswrapper[4740]: I1009 10:44:09.263444 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d4a2-account-create-kj8v8" Oct 09 10:44:09 crc kubenswrapper[4740]: I1009 10:44:09.263932 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kftkn\" (UniqueName: \"kubernetes.io/projected/82d7f29f-de12-4aed-95ee-47b80666098b-kube-api-access-kftkn\") pod \"82d7f29f-de12-4aed-95ee-47b80666098b\" (UID: \"82d7f29f-de12-4aed-95ee-47b80666098b\") " Oct 09 10:44:09 crc kubenswrapper[4740]: I1009 10:44:09.269746 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82d7f29f-de12-4aed-95ee-47b80666098b-kube-api-access-kftkn" (OuterVolumeSpecName: "kube-api-access-kftkn") pod "82d7f29f-de12-4aed-95ee-47b80666098b" (UID: "82d7f29f-de12-4aed-95ee-47b80666098b"). 
InnerVolumeSpecName "kube-api-access-kftkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:44:09 crc kubenswrapper[4740]: I1009 10:44:09.365888 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnch4\" (UniqueName: \"kubernetes.io/projected/1de7a924-745d-40e0-8358-271bb7034f87-kube-api-access-tnch4\") pod \"1de7a924-745d-40e0-8358-271bb7034f87\" (UID: \"1de7a924-745d-40e0-8358-271bb7034f87\") " Oct 09 10:44:09 crc kubenswrapper[4740]: I1009 10:44:09.366618 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kftkn\" (UniqueName: \"kubernetes.io/projected/82d7f29f-de12-4aed-95ee-47b80666098b-kube-api-access-kftkn\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:09 crc kubenswrapper[4740]: I1009 10:44:09.368954 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1de7a924-745d-40e0-8358-271bb7034f87-kube-api-access-tnch4" (OuterVolumeSpecName: "kube-api-access-tnch4") pod "1de7a924-745d-40e0-8358-271bb7034f87" (UID: "1de7a924-745d-40e0-8358-271bb7034f87"). InnerVolumeSpecName "kube-api-access-tnch4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:44:09 crc kubenswrapper[4740]: I1009 10:44:09.467694 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnch4\" (UniqueName: \"kubernetes.io/projected/1de7a924-745d-40e0-8358-271bb7034f87-kube-api-access-tnch4\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:09 crc kubenswrapper[4740]: I1009 10:44:09.773405 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d4a2-account-create-kj8v8" Oct 09 10:44:09 crc kubenswrapper[4740]: I1009 10:44:09.775275 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a11218-32c1-4b40-a738-f56e795904d7","Type":"ContainerStarted","Data":"8ee68e2759afcadab410dae5a4ac317c34dcecb3ba71829c89ed4c57cb666e4d"} Oct 09 10:44:09 crc kubenswrapper[4740]: I1009 10:44:09.775311 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d4a2-account-create-kj8v8" event={"ID":"1de7a924-745d-40e0-8358-271bb7034f87","Type":"ContainerDied","Data":"301f5fc83e878a2603ae36bf4ac2ec3b0a4de552752e5210798f9ba2e973e8d6"} Oct 09 10:44:09 crc kubenswrapper[4740]: I1009 10:44:09.775326 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="301f5fc83e878a2603ae36bf4ac2ec3b0a4de552752e5210798f9ba2e973e8d6" Oct 09 10:44:09 crc kubenswrapper[4740]: I1009 10:44:09.776341 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0fdb-account-create-7jl66" event={"ID":"82d7f29f-de12-4aed-95ee-47b80666098b","Type":"ContainerDied","Data":"75f2dd667febaca2fd2f9b8b3bda077efa7b3c53b26ccd8c9625d3f4dd9992b8"} Oct 09 10:44:09 crc kubenswrapper[4740]: I1009 10:44:09.776364 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75f2dd667febaca2fd2f9b8b3bda077efa7b3c53b26ccd8c9625d3f4dd9992b8" Oct 09 10:44:09 crc kubenswrapper[4740]: I1009 10:44:09.776441 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0fdb-account-create-7jl66" Oct 09 10:44:10 crc kubenswrapper[4740]: I1009 10:44:10.790038 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a11218-32c1-4b40-a738-f56e795904d7","Type":"ContainerStarted","Data":"3f835e33ff9798b280ce60e198cb909acca7f461cc7a62085c7dc1d6dc656d63"} Oct 09 10:44:10 crc kubenswrapper[4740]: I1009 10:44:10.790414 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a11218-32c1-4b40-a738-f56e795904d7","Type":"ContainerStarted","Data":"d7cf25fb6a0fc9c699e6cfb93c9cfa3480becef620bb02d56ce7e948299f40f8"} Oct 09 10:44:10 crc kubenswrapper[4740]: I1009 10:44:10.790432 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a11218-32c1-4b40-a738-f56e795904d7","Type":"ContainerStarted","Data":"aa8306cf4aabe8ffb2fee60ffa696c90e7a2e9e5b730976fb3a8620d49f44176"} Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.327500 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-c6rld" podUID="7f56ff38-de3a-4c48-8fc0-43e0eac26c55" containerName="ovn-controller" probeResult="failure" output=< Oct 09 10:44:11 crc kubenswrapper[4740]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 09 10:44:11 crc kubenswrapper[4740]: > Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.384909 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.390951 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cwdss" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.622821 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-c6rld-config-6ckhr"] Oct 09 10:44:11 crc kubenswrapper[4740]: E1009 
10:44:11.623723 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d7f29f-de12-4aed-95ee-47b80666098b" containerName="mariadb-account-create" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.623751 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d7f29f-de12-4aed-95ee-47b80666098b" containerName="mariadb-account-create" Oct 09 10:44:11 crc kubenswrapper[4740]: E1009 10:44:11.623792 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1de7a924-745d-40e0-8358-271bb7034f87" containerName="mariadb-account-create" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.623802 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="1de7a924-745d-40e0-8358-271bb7034f87" containerName="mariadb-account-create" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.624222 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d7f29f-de12-4aed-95ee-47b80666098b" containerName="mariadb-account-create" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.624288 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="1de7a924-745d-40e0-8358-271bb7034f87" containerName="mariadb-account-create" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.624893 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.628689 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.637817 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c6rld-config-6ckhr"] Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.707498 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b7290258-99fc-46a4-9161-2457d4b859f3-var-run-ovn\") pod \"ovn-controller-c6rld-config-6ckhr\" (UID: \"b7290258-99fc-46a4-9161-2457d4b859f3\") " pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.707543 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b7290258-99fc-46a4-9161-2457d4b859f3-var-log-ovn\") pod \"ovn-controller-c6rld-config-6ckhr\" (UID: \"b7290258-99fc-46a4-9161-2457d4b859f3\") " pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.707584 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7290258-99fc-46a4-9161-2457d4b859f3-scripts\") pod \"ovn-controller-c6rld-config-6ckhr\" (UID: \"b7290258-99fc-46a4-9161-2457d4b859f3\") " pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.707636 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b7290258-99fc-46a4-9161-2457d4b859f3-var-run\") pod \"ovn-controller-c6rld-config-6ckhr\" (UID: \"b7290258-99fc-46a4-9161-2457d4b859f3\") 
" pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.707737 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-246gk\" (UniqueName: \"kubernetes.io/projected/b7290258-99fc-46a4-9161-2457d4b859f3-kube-api-access-246gk\") pod \"ovn-controller-c6rld-config-6ckhr\" (UID: \"b7290258-99fc-46a4-9161-2457d4b859f3\") " pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.707915 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b7290258-99fc-46a4-9161-2457d4b859f3-additional-scripts\") pod \"ovn-controller-c6rld-config-6ckhr\" (UID: \"b7290258-99fc-46a4-9161-2457d4b859f3\") " pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.809426 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b7290258-99fc-46a4-9161-2457d4b859f3-var-run\") pod \"ovn-controller-c6rld-config-6ckhr\" (UID: \"b7290258-99fc-46a4-9161-2457d4b859f3\") " pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.809702 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b7290258-99fc-46a4-9161-2457d4b859f3-var-run\") pod \"ovn-controller-c6rld-config-6ckhr\" (UID: \"b7290258-99fc-46a4-9161-2457d4b859f3\") " pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.809784 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-246gk\" (UniqueName: \"kubernetes.io/projected/b7290258-99fc-46a4-9161-2457d4b859f3-kube-api-access-246gk\") pod \"ovn-controller-c6rld-config-6ckhr\" (UID: 
\"b7290258-99fc-46a4-9161-2457d4b859f3\") " pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.810073 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b7290258-99fc-46a4-9161-2457d4b859f3-additional-scripts\") pod \"ovn-controller-c6rld-config-6ckhr\" (UID: \"b7290258-99fc-46a4-9161-2457d4b859f3\") " pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.810139 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b7290258-99fc-46a4-9161-2457d4b859f3-var-run-ovn\") pod \"ovn-controller-c6rld-config-6ckhr\" (UID: \"b7290258-99fc-46a4-9161-2457d4b859f3\") " pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.810154 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b7290258-99fc-46a4-9161-2457d4b859f3-var-log-ovn\") pod \"ovn-controller-c6rld-config-6ckhr\" (UID: \"b7290258-99fc-46a4-9161-2457d4b859f3\") " pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.810223 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b7290258-99fc-46a4-9161-2457d4b859f3-var-run-ovn\") pod \"ovn-controller-c6rld-config-6ckhr\" (UID: \"b7290258-99fc-46a4-9161-2457d4b859f3\") " pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.810226 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7290258-99fc-46a4-9161-2457d4b859f3-scripts\") pod \"ovn-controller-c6rld-config-6ckhr\" (UID: 
\"b7290258-99fc-46a4-9161-2457d4b859f3\") " pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.810294 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b7290258-99fc-46a4-9161-2457d4b859f3-var-log-ovn\") pod \"ovn-controller-c6rld-config-6ckhr\" (UID: \"b7290258-99fc-46a4-9161-2457d4b859f3\") " pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.810789 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b7290258-99fc-46a4-9161-2457d4b859f3-additional-scripts\") pod \"ovn-controller-c6rld-config-6ckhr\" (UID: \"b7290258-99fc-46a4-9161-2457d4b859f3\") " pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.814909 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7290258-99fc-46a4-9161-2457d4b859f3-scripts\") pod \"ovn-controller-c6rld-config-6ckhr\" (UID: \"b7290258-99fc-46a4-9161-2457d4b859f3\") " pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.817340 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a11218-32c1-4b40-a738-f56e795904d7","Type":"ContainerStarted","Data":"ca3fd3567862a20126a8cb9432e54bbbdd475a476c452a20da2dba602749fdf6"} Oct 09 10:44:11 crc kubenswrapper[4740]: I1009 10:44:11.838533 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-246gk\" (UniqueName: \"kubernetes.io/projected/b7290258-99fc-46a4-9161-2457d4b859f3-kube-api-access-246gk\") pod \"ovn-controller-c6rld-config-6ckhr\" (UID: \"b7290258-99fc-46a4-9161-2457d4b859f3\") " pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:11 crc 
kubenswrapper[4740]: I1009 10:44:11.981796 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:12 crc kubenswrapper[4740]: I1009 10:44:12.504650 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c6rld-config-6ckhr"] Oct 09 10:44:12 crc kubenswrapper[4740]: W1009 10:44:12.528428 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7290258_99fc_46a4_9161_2457d4b859f3.slice/crio-9919c10e10d3efafc51f3d26fed3ff7966dddfb4982e3fa808a4e9a2fa6a923e WatchSource:0}: Error finding container 9919c10e10d3efafc51f3d26fed3ff7966dddfb4982e3fa808a4e9a2fa6a923e: Status 404 returned error can't find the container with id 9919c10e10d3efafc51f3d26fed3ff7966dddfb4982e3fa808a4e9a2fa6a923e Oct 09 10:44:12 crc kubenswrapper[4740]: I1009 10:44:12.828947 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c6rld-config-6ckhr" event={"ID":"b7290258-99fc-46a4-9161-2457d4b859f3","Type":"ContainerStarted","Data":"9919c10e10d3efafc51f3d26fed3ff7966dddfb4982e3fa808a4e9a2fa6a923e"} Oct 09 10:44:12 crc kubenswrapper[4740]: I1009 10:44:12.837602 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a11218-32c1-4b40-a738-f56e795904d7","Type":"ContainerStarted","Data":"9926a5fdf00ebbc1a69b870befd6eee477e0c3661aed5a78c9f4693e5677f0e0"} Oct 09 10:44:12 crc kubenswrapper[4740]: I1009 10:44:12.837649 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a11218-32c1-4b40-a738-f56e795904d7","Type":"ContainerStarted","Data":"51b3c4892c3e7419562491c71f98d04dac9b956be01530a1a797c84d14f88782"} Oct 09 10:44:12 crc kubenswrapper[4740]: I1009 10:44:12.837659 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"73a11218-32c1-4b40-a738-f56e795904d7","Type":"ContainerStarted","Data":"24ac2e3602b1ae1f878e25fe05c17aaa00048101cc09a84f602078787a55581a"} Oct 09 10:44:12 crc kubenswrapper[4740]: I1009 10:44:12.837672 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a11218-32c1-4b40-a738-f56e795904d7","Type":"ContainerStarted","Data":"fa8805cca5d1dbc453222ba750f0b679c5ee94e0dc2809672ea08354499ad922"} Oct 09 10:44:13 crc kubenswrapper[4740]: I1009 10:44:13.849952 4740 generic.go:334] "Generic (PLEG): container finished" podID="b7290258-99fc-46a4-9161-2457d4b859f3" containerID="bf39e63bc4d955c62d5b2326fc62b360c447c4c352a8032b2f5e14e8ff7a2b5a" exitCode=0 Oct 09 10:44:13 crc kubenswrapper[4740]: I1009 10:44:13.850014 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c6rld-config-6ckhr" event={"ID":"b7290258-99fc-46a4-9161-2457d4b859f3","Type":"ContainerDied","Data":"bf39e63bc4d955c62d5b2326fc62b360c447c4c352a8032b2f5e14e8ff7a2b5a"} Oct 09 10:44:13 crc kubenswrapper[4740]: I1009 10:44:13.858353 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a11218-32c1-4b40-a738-f56e795904d7","Type":"ContainerStarted","Data":"bf9d09ae346256ac50290d24da81a3e63e0859abcc435a37c3dd8da746bb51a4"} Oct 09 10:44:13 crc kubenswrapper[4740]: I1009 10:44:13.858396 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a11218-32c1-4b40-a738-f56e795904d7","Type":"ContainerStarted","Data":"725b9bc8a9f78269eb1b6fa670d7bc3dd67b2e6caa0ef1587446f0f97ffecffd"} Oct 09 10:44:13 crc kubenswrapper[4740]: I1009 10:44:13.904271 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.523591324 podStartE2EDuration="25.904253357s" podCreationTimestamp="2025-10-09 10:43:48 +0000 UTC" firstStartedPulling="2025-10-09 10:44:06.165939743 +0000 UTC m=+985.128140134" 
lastFinishedPulling="2025-10-09 10:44:11.546601786 +0000 UTC m=+990.508802167" observedRunningTime="2025-10-09 10:44:13.903531087 +0000 UTC m=+992.865731498" watchObservedRunningTime="2025-10-09 10:44:13.904253357 +0000 UTC m=+992.866453738" Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.156852 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-66lt6"] Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.158153 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.160898 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.210367 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-66lt6"] Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.246817 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-66lt6\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.246856 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v28sx\" (UniqueName: \"kubernetes.io/projected/d40ec31e-4a50-4dae-a2b7-e48354125946-kube-api-access-v28sx\") pod \"dnsmasq-dns-77585f5f8c-66lt6\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.246936 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-config\") pod \"dnsmasq-dns-77585f5f8c-66lt6\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.246954 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-66lt6\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.247109 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-66lt6\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.247243 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-66lt6\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.349041 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-66lt6\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.349081 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v28sx\" (UniqueName: 
\"kubernetes.io/projected/d40ec31e-4a50-4dae-a2b7-e48354125946-kube-api-access-v28sx\") pod \"dnsmasq-dns-77585f5f8c-66lt6\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.349141 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-config\") pod \"dnsmasq-dns-77585f5f8c-66lt6\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.349156 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-66lt6\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.349197 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-66lt6\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.349250 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-66lt6\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.350304 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-dns-svc\") pod 
\"dnsmasq-dns-77585f5f8c-66lt6\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.350602 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-66lt6\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.350669 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-66lt6\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.350778 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-config\") pod \"dnsmasq-dns-77585f5f8c-66lt6\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.351002 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-66lt6\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.371096 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v28sx\" (UniqueName: \"kubernetes.io/projected/d40ec31e-4a50-4dae-a2b7-e48354125946-kube-api-access-v28sx\") pod \"dnsmasq-dns-77585f5f8c-66lt6\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " 
pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:44:14 crc kubenswrapper[4740]: I1009 10:44:14.472838 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:44:16 crc kubenswrapper[4740]: I1009 10:44:16.325189 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-c6rld" Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.593159 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.732395 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b7290258-99fc-46a4-9161-2457d4b859f3-additional-scripts\") pod \"b7290258-99fc-46a4-9161-2457d4b859f3\" (UID: \"b7290258-99fc-46a4-9161-2457d4b859f3\") " Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.732830 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-246gk\" (UniqueName: \"kubernetes.io/projected/b7290258-99fc-46a4-9161-2457d4b859f3-kube-api-access-246gk\") pod \"b7290258-99fc-46a4-9161-2457d4b859f3\" (UID: \"b7290258-99fc-46a4-9161-2457d4b859f3\") " Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.732859 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b7290258-99fc-46a4-9161-2457d4b859f3-var-run-ovn\") pod \"b7290258-99fc-46a4-9161-2457d4b859f3\" (UID: \"b7290258-99fc-46a4-9161-2457d4b859f3\") " Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.732887 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b7290258-99fc-46a4-9161-2457d4b859f3-var-run\") pod \"b7290258-99fc-46a4-9161-2457d4b859f3\" (UID: 
\"b7290258-99fc-46a4-9161-2457d4b859f3\") " Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.732923 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b7290258-99fc-46a4-9161-2457d4b859f3-var-log-ovn\") pod \"b7290258-99fc-46a4-9161-2457d4b859f3\" (UID: \"b7290258-99fc-46a4-9161-2457d4b859f3\") " Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.733027 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7290258-99fc-46a4-9161-2457d4b859f3-scripts\") pod \"b7290258-99fc-46a4-9161-2457d4b859f3\" (UID: \"b7290258-99fc-46a4-9161-2457d4b859f3\") " Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.733017 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7290258-99fc-46a4-9161-2457d4b859f3-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b7290258-99fc-46a4-9161-2457d4b859f3" (UID: "b7290258-99fc-46a4-9161-2457d4b859f3"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.733069 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7290258-99fc-46a4-9161-2457d4b859f3-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b7290258-99fc-46a4-9161-2457d4b859f3" (UID: "b7290258-99fc-46a4-9161-2457d4b859f3"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.733032 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7290258-99fc-46a4-9161-2457d4b859f3-var-run" (OuterVolumeSpecName: "var-run") pod "b7290258-99fc-46a4-9161-2457d4b859f3" (UID: "b7290258-99fc-46a4-9161-2457d4b859f3"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.733460 4740 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b7290258-99fc-46a4-9161-2457d4b859f3-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.733482 4740 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b7290258-99fc-46a4-9161-2457d4b859f3-var-run\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.733482 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7290258-99fc-46a4-9161-2457d4b859f3-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b7290258-99fc-46a4-9161-2457d4b859f3" (UID: "b7290258-99fc-46a4-9161-2457d4b859f3"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.733493 4740 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b7290258-99fc-46a4-9161-2457d4b859f3-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.734197 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7290258-99fc-46a4-9161-2457d4b859f3-scripts" (OuterVolumeSpecName: "scripts") pod "b7290258-99fc-46a4-9161-2457d4b859f3" (UID: "b7290258-99fc-46a4-9161-2457d4b859f3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.736464 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7290258-99fc-46a4-9161-2457d4b859f3-kube-api-access-246gk" (OuterVolumeSpecName: "kube-api-access-246gk") pod "b7290258-99fc-46a4-9161-2457d4b859f3" (UID: "b7290258-99fc-46a4-9161-2457d4b859f3"). InnerVolumeSpecName "kube-api-access-246gk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.834911 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7290258-99fc-46a4-9161-2457d4b859f3-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.834938 4740 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b7290258-99fc-46a4-9161-2457d4b859f3-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.834948 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-246gk\" (UniqueName: \"kubernetes.io/projected/b7290258-99fc-46a4-9161-2457d4b859f3-kube-api-access-246gk\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.923955 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-c6rld-config-6ckhr" Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.924780 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c6rld-config-6ckhr" event={"ID":"b7290258-99fc-46a4-9161-2457d4b859f3","Type":"ContainerDied","Data":"9919c10e10d3efafc51f3d26fed3ff7966dddfb4982e3fa808a4e9a2fa6a923e"} Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.924813 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9919c10e10d3efafc51f3d26fed3ff7966dddfb4982e3fa808a4e9a2fa6a923e" Oct 09 10:44:19 crc kubenswrapper[4740]: I1009 10:44:19.927409 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-66lt6"] Oct 09 10:44:19 crc kubenswrapper[4740]: W1009 10:44:19.935130 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd40ec31e_4a50_4dae_a2b7_e48354125946.slice/crio-ef480d5eabbdce7ee1ece0ccd6ba2c226f4b03b29c7eed76c825e35cb91fe083 WatchSource:0}: Error finding container ef480d5eabbdce7ee1ece0ccd6ba2c226f4b03b29c7eed76c825e35cb91fe083: Status 404 returned error can't find the container with id ef480d5eabbdce7ee1ece0ccd6ba2c226f4b03b29c7eed76c825e35cb91fe083 Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.718645 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-c6rld-config-6ckhr"] Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.727158 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-c6rld-config-6ckhr"] Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.803188 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-c6rld-config-t4mt7"] Oct 09 10:44:20 crc kubenswrapper[4740]: E1009 10:44:20.803672 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7290258-99fc-46a4-9161-2457d4b859f3" 
containerName="ovn-config" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.803699 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7290258-99fc-46a4-9161-2457d4b859f3" containerName="ovn-config" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.803941 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7290258-99fc-46a4-9161-2457d4b859f3" containerName="ovn-config" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.804685 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.806767 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.828948 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c6rld-config-t4mt7"] Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.852483 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9bhb\" (UniqueName: \"kubernetes.io/projected/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-kube-api-access-r9bhb\") pod \"ovn-controller-c6rld-config-t4mt7\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.852559 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-var-run\") pod \"ovn-controller-c6rld-config-t4mt7\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.852596 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-var-run-ovn\") pod \"ovn-controller-c6rld-config-t4mt7\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.852716 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-additional-scripts\") pod \"ovn-controller-c6rld-config-t4mt7\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.852772 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-scripts\") pod \"ovn-controller-c6rld-config-t4mt7\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.852888 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-var-log-ovn\") pod \"ovn-controller-c6rld-config-t4mt7\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.939488 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-cpnq8" event={"ID":"9c004cb4-8052-425c-ac2e-11159a708cad","Type":"ContainerStarted","Data":"cf09e3d7980f8112f9248ddd31242a6090428f7c3bf95ae6fb5ba6ee2994cf74"} Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.945959 4740 generic.go:334] "Generic (PLEG): container finished" podID="d40ec31e-4a50-4dae-a2b7-e48354125946" 
containerID="c941dd3f5c24549c433058e708d5c4d473c0c6306323e4dce440f351a88f93c0" exitCode=0 Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.946002 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" event={"ID":"d40ec31e-4a50-4dae-a2b7-e48354125946","Type":"ContainerDied","Data":"c941dd3f5c24549c433058e708d5c4d473c0c6306323e4dce440f351a88f93c0"} Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.946034 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" event={"ID":"d40ec31e-4a50-4dae-a2b7-e48354125946","Type":"ContainerStarted","Data":"ef480d5eabbdce7ee1ece0ccd6ba2c226f4b03b29c7eed76c825e35cb91fe083"} Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.955133 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-var-log-ovn\") pod \"ovn-controller-c6rld-config-t4mt7\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.955253 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9bhb\" (UniqueName: \"kubernetes.io/projected/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-kube-api-access-r9bhb\") pod \"ovn-controller-c6rld-config-t4mt7\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.955297 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-var-run\") pod \"ovn-controller-c6rld-config-t4mt7\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.955336 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-var-run-ovn\") pod \"ovn-controller-c6rld-config-t4mt7\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.955393 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-additional-scripts\") pod \"ovn-controller-c6rld-config-t4mt7\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.955415 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-scripts\") pod \"ovn-controller-c6rld-config-t4mt7\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.955435 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-var-log-ovn\") pod \"ovn-controller-c6rld-config-t4mt7\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.955522 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-var-run\") pod \"ovn-controller-c6rld-config-t4mt7\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.955622 4740 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-var-run-ovn\") pod \"ovn-controller-c6rld-config-t4mt7\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.956369 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-additional-scripts\") pod \"ovn-controller-c6rld-config-t4mt7\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.961201 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-cpnq8" podStartSLOduration=2.918537098 podStartE2EDuration="14.961181491s" podCreationTimestamp="2025-10-09 10:44:06 +0000 UTC" firstStartedPulling="2025-10-09 10:44:07.489901326 +0000 UTC m=+986.452101697" lastFinishedPulling="2025-10-09 10:44:19.532545709 +0000 UTC m=+998.494746090" observedRunningTime="2025-10-09 10:44:20.957115989 +0000 UTC m=+999.919316370" watchObservedRunningTime="2025-10-09 10:44:20.961181491 +0000 UTC m=+999.923381872" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.978839 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-scripts\") pod \"ovn-controller-c6rld-config-t4mt7\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:20 crc kubenswrapper[4740]: I1009 10:44:20.983310 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9bhb\" (UniqueName: \"kubernetes.io/projected/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-kube-api-access-r9bhb\") pod \"ovn-controller-c6rld-config-t4mt7\" (UID: 
\"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:21 crc kubenswrapper[4740]: I1009 10:44:21.123050 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:21 crc kubenswrapper[4740]: I1009 10:44:21.353979 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c6rld-config-t4mt7"] Oct 09 10:44:21 crc kubenswrapper[4740]: W1009 10:44:21.365680 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7421ee0c_c50b_4bfa_bd8e_81d555e3aca9.slice/crio-a90854918811a87f45315a9921d03221c506f0162f72423a246bccb465f82e55 WatchSource:0}: Error finding container a90854918811a87f45315a9921d03221c506f0162f72423a246bccb465f82e55: Status 404 returned error can't find the container with id a90854918811a87f45315a9921d03221c506f0162f72423a246bccb465f82e55 Oct 09 10:44:21 crc kubenswrapper[4740]: I1009 10:44:21.764979 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7290258-99fc-46a4-9161-2457d4b859f3" path="/var/lib/kubelet/pods/b7290258-99fc-46a4-9161-2457d4b859f3/volumes" Oct 09 10:44:21 crc kubenswrapper[4740]: I1009 10:44:21.955676 4740 generic.go:334] "Generic (PLEG): container finished" podID="7421ee0c-c50b-4bfa-bd8e-81d555e3aca9" containerID="ee2fd56da6d03fafffab9c3ee1e97e3ee4fd8e9a23ca462b1ed0428116c0d9cb" exitCode=0 Oct 09 10:44:21 crc kubenswrapper[4740]: I1009 10:44:21.955806 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c6rld-config-t4mt7" event={"ID":"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9","Type":"ContainerDied","Data":"ee2fd56da6d03fafffab9c3ee1e97e3ee4fd8e9a23ca462b1ed0428116c0d9cb"} Oct 09 10:44:21 crc kubenswrapper[4740]: I1009 10:44:21.956212 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c6rld-config-t4mt7" 
event={"ID":"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9","Type":"ContainerStarted","Data":"a90854918811a87f45315a9921d03221c506f0162f72423a246bccb465f82e55"} Oct 09 10:44:21 crc kubenswrapper[4740]: I1009 10:44:21.959170 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" event={"ID":"d40ec31e-4a50-4dae-a2b7-e48354125946","Type":"ContainerStarted","Data":"e46f42a51bb24b608abd5dfea0bc4323aa6a816650f21f47036ac71ea4eebe4e"} Oct 09 10:44:21 crc kubenswrapper[4740]: I1009 10:44:21.959229 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.004903 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" podStartSLOduration=8.004883446 podStartE2EDuration="8.004883446s" podCreationTimestamp="2025-10-09 10:44:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:44:21.997432731 +0000 UTC m=+1000.959633122" watchObservedRunningTime="2025-10-09 10:44:22.004883446 +0000 UTC m=+1000.967083847" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.275007 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.562955 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.627142 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-r8w7m"] Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.628224 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-r8w7m" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.653417 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-r8w7m"] Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.682261 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwrfv\" (UniqueName: \"kubernetes.io/projected/1cd4630f-ae0a-422a-9a31-a5b833aa9f79-kube-api-access-zwrfv\") pod \"cinder-db-create-r8w7m\" (UID: \"1cd4630f-ae0a-422a-9a31-a5b833aa9f79\") " pod="openstack/cinder-db-create-r8w7m" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.733901 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-8wxbl"] Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.735821 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8wxbl" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.754521 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-8wxbl"] Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.786115 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwrfv\" (UniqueName: \"kubernetes.io/projected/1cd4630f-ae0a-422a-9a31-a5b833aa9f79-kube-api-access-zwrfv\") pod \"cinder-db-create-r8w7m\" (UID: \"1cd4630f-ae0a-422a-9a31-a5b833aa9f79\") " pod="openstack/cinder-db-create-r8w7m" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.786199 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g22fp\" (UniqueName: \"kubernetes.io/projected/7618ac8f-2d0b-49da-943b-13dd939652d0-kube-api-access-g22fp\") pod \"barbican-db-create-8wxbl\" (UID: \"7618ac8f-2d0b-49da-943b-13dd939652d0\") " pod="openstack/barbican-db-create-8wxbl" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.811197 
4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwrfv\" (UniqueName: \"kubernetes.io/projected/1cd4630f-ae0a-422a-9a31-a5b833aa9f79-kube-api-access-zwrfv\") pod \"cinder-db-create-r8w7m\" (UID: \"1cd4630f-ae0a-422a-9a31-a5b833aa9f79\") " pod="openstack/cinder-db-create-r8w7m" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.887905 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g22fp\" (UniqueName: \"kubernetes.io/projected/7618ac8f-2d0b-49da-943b-13dd939652d0-kube-api-access-g22fp\") pod \"barbican-db-create-8wxbl\" (UID: \"7618ac8f-2d0b-49da-943b-13dd939652d0\") " pod="openstack/barbican-db-create-8wxbl" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.907065 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g22fp\" (UniqueName: \"kubernetes.io/projected/7618ac8f-2d0b-49da-943b-13dd939652d0-kube-api-access-g22fp\") pod \"barbican-db-create-8wxbl\" (UID: \"7618ac8f-2d0b-49da-943b-13dd939652d0\") " pod="openstack/barbican-db-create-8wxbl" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.928119 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-btzp2"] Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.929339 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-btzp2" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.942510 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-btzp2"] Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.945843 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-r8w7m" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.955552 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-kh7ft"] Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.968909 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kh7ft" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.987176 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.987382 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.987417 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-52ljb" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.987876 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.988650 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365ec886-8ed3-4c64-a794-502bfef4fedf-combined-ca-bundle\") pod \"keystone-db-sync-kh7ft\" (UID: \"365ec886-8ed3-4c64-a794-502bfef4fedf\") " pod="openstack/keystone-db-sync-kh7ft" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.988896 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/365ec886-8ed3-4c64-a794-502bfef4fedf-config-data\") pod \"keystone-db-sync-kh7ft\" (UID: \"365ec886-8ed3-4c64-a794-502bfef4fedf\") " pod="openstack/keystone-db-sync-kh7ft" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.989065 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-7bp9c\" (UniqueName: \"kubernetes.io/projected/365ec886-8ed3-4c64-a794-502bfef4fedf-kube-api-access-7bp9c\") pod \"keystone-db-sync-kh7ft\" (UID: \"365ec886-8ed3-4c64-a794-502bfef4fedf\") " pod="openstack/keystone-db-sync-kh7ft" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.989609 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b86d\" (UniqueName: \"kubernetes.io/projected/93ce61e7-f832-4491-a80f-0f0bc24d15cd-kube-api-access-2b86d\") pod \"neutron-db-create-btzp2\" (UID: \"93ce61e7-f832-4491-a80f-0f0bc24d15cd\") " pod="openstack/neutron-db-create-btzp2" Oct 09 10:44:22 crc kubenswrapper[4740]: I1009 10:44:22.991051 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kh7ft"] Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.090745 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b86d\" (UniqueName: \"kubernetes.io/projected/93ce61e7-f832-4491-a80f-0f0bc24d15cd-kube-api-access-2b86d\") pod \"neutron-db-create-btzp2\" (UID: \"93ce61e7-f832-4491-a80f-0f0bc24d15cd\") " pod="openstack/neutron-db-create-btzp2" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.090863 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365ec886-8ed3-4c64-a794-502bfef4fedf-combined-ca-bundle\") pod \"keystone-db-sync-kh7ft\" (UID: \"365ec886-8ed3-4c64-a794-502bfef4fedf\") " pod="openstack/keystone-db-sync-kh7ft" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.090904 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/365ec886-8ed3-4c64-a794-502bfef4fedf-config-data\") pod \"keystone-db-sync-kh7ft\" (UID: \"365ec886-8ed3-4c64-a794-502bfef4fedf\") " pod="openstack/keystone-db-sync-kh7ft" Oct 09 10:44:23 crc 
kubenswrapper[4740]: I1009 10:44:23.090984 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bp9c\" (UniqueName: \"kubernetes.io/projected/365ec886-8ed3-4c64-a794-502bfef4fedf-kube-api-access-7bp9c\") pod \"keystone-db-sync-kh7ft\" (UID: \"365ec886-8ed3-4c64-a794-502bfef4fedf\") " pod="openstack/keystone-db-sync-kh7ft" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.091906 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8wxbl" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.097811 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/365ec886-8ed3-4c64-a794-502bfef4fedf-config-data\") pod \"keystone-db-sync-kh7ft\" (UID: \"365ec886-8ed3-4c64-a794-502bfef4fedf\") " pod="openstack/keystone-db-sync-kh7ft" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.098498 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365ec886-8ed3-4c64-a794-502bfef4fedf-combined-ca-bundle\") pod \"keystone-db-sync-kh7ft\" (UID: \"365ec886-8ed3-4c64-a794-502bfef4fedf\") " pod="openstack/keystone-db-sync-kh7ft" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.121923 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b86d\" (UniqueName: \"kubernetes.io/projected/93ce61e7-f832-4491-a80f-0f0bc24d15cd-kube-api-access-2b86d\") pod \"neutron-db-create-btzp2\" (UID: \"93ce61e7-f832-4491-a80f-0f0bc24d15cd\") " pod="openstack/neutron-db-create-btzp2" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.126628 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bp9c\" (UniqueName: \"kubernetes.io/projected/365ec886-8ed3-4c64-a794-502bfef4fedf-kube-api-access-7bp9c\") pod \"keystone-db-sync-kh7ft\" (UID: 
\"365ec886-8ed3-4c64-a794-502bfef4fedf\") " pod="openstack/keystone-db-sync-kh7ft" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.258255 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.262160 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-btzp2" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.294438 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-additional-scripts\") pod \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.294528 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9bhb\" (UniqueName: \"kubernetes.io/projected/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-kube-api-access-r9bhb\") pod \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.294592 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-var-run-ovn\") pod \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.294675 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-scripts\") pod \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.294737 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-var-log-ovn\") pod \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.294782 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-var-run\") pod \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\" (UID: \"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9\") " Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.295047 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "7421ee0c-c50b-4bfa-bd8e-81d555e3aca9" (UID: "7421ee0c-c50b-4bfa-bd8e-81d555e3aca9"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.295771 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "7421ee0c-c50b-4bfa-bd8e-81d555e3aca9" (UID: "7421ee0c-c50b-4bfa-bd8e-81d555e3aca9"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.296961 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-scripts" (OuterVolumeSpecName: "scripts") pod "7421ee0c-c50b-4bfa-bd8e-81d555e3aca9" (UID: "7421ee0c-c50b-4bfa-bd8e-81d555e3aca9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.296997 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "7421ee0c-c50b-4bfa-bd8e-81d555e3aca9" (UID: "7421ee0c-c50b-4bfa-bd8e-81d555e3aca9"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.297013 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-var-run" (OuterVolumeSpecName: "var-run") pod "7421ee0c-c50b-4bfa-bd8e-81d555e3aca9" (UID: "7421ee0c-c50b-4bfa-bd8e-81d555e3aca9"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.297136 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kh7ft" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.307940 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-kube-api-access-r9bhb" (OuterVolumeSpecName: "kube-api-access-r9bhb") pod "7421ee0c-c50b-4bfa-bd8e-81d555e3aca9" (UID: "7421ee0c-c50b-4bfa-bd8e-81d555e3aca9"). InnerVolumeSpecName "kube-api-access-r9bhb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.398548 4740 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.398587 4740 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-var-run\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.398622 4740 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.398638 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9bhb\" (UniqueName: \"kubernetes.io/projected/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-kube-api-access-r9bhb\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.398649 4740 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.398657 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.523684 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-r8w7m"] Oct 09 10:44:23 crc kubenswrapper[4740]: W1009 10:44:23.525616 4740 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cd4630f_ae0a_422a_9a31_a5b833aa9f79.slice/crio-f3c9644f9bbd67b9da2269d980ec06f7b53c790c073ec3c462dcc5df9555b714 WatchSource:0}: Error finding container f3c9644f9bbd67b9da2269d980ec06f7b53c790c073ec3c462dcc5df9555b714: Status 404 returned error can't find the container with id f3c9644f9bbd67b9da2269d980ec06f7b53c790c073ec3c462dcc5df9555b714 Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.630433 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-8wxbl"] Oct 09 10:44:23 crc kubenswrapper[4740]: W1009 10:44:23.633615 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7618ac8f_2d0b_49da_943b_13dd939652d0.slice/crio-406bfb2b458f6d6c4878519e3f65fdf35e77bcc6a92227786e96f05b2180fb79 WatchSource:0}: Error finding container 406bfb2b458f6d6c4878519e3f65fdf35e77bcc6a92227786e96f05b2180fb79: Status 404 returned error can't find the container with id 406bfb2b458f6d6c4878519e3f65fdf35e77bcc6a92227786e96f05b2180fb79 Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.779274 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-btzp2"] Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.786014 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kh7ft"] Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.975018 4740 generic.go:334] "Generic (PLEG): container finished" podID="1cd4630f-ae0a-422a-9a31-a5b833aa9f79" containerID="50abcfd8c11eac6174bc771ce70680b1bd07a5a3b6b353876701c1c09e71815c" exitCode=0 Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.975475 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-r8w7m" event={"ID":"1cd4630f-ae0a-422a-9a31-a5b833aa9f79","Type":"ContainerDied","Data":"50abcfd8c11eac6174bc771ce70680b1bd07a5a3b6b353876701c1c09e71815c"} Oct 09 
10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.975512 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-r8w7m" event={"ID":"1cd4630f-ae0a-422a-9a31-a5b833aa9f79","Type":"ContainerStarted","Data":"f3c9644f9bbd67b9da2269d980ec06f7b53c790c073ec3c462dcc5df9555b714"} Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.976964 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-btzp2" event={"ID":"93ce61e7-f832-4491-a80f-0f0bc24d15cd","Type":"ContainerStarted","Data":"b8b537abaf6300667a488d90e08dcb03ce8146d74708fdc939f118b58598fdad"} Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.980012 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kh7ft" event={"ID":"365ec886-8ed3-4c64-a794-502bfef4fedf","Type":"ContainerStarted","Data":"4ede428a4b70622ac0064ba3f143bff30e70a5fc1845ea85cb75fd4739e42f6e"} Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.981802 4740 generic.go:334] "Generic (PLEG): container finished" podID="7618ac8f-2d0b-49da-943b-13dd939652d0" containerID="5947c915f15c763c4aa2bccee5f0ac63b696b4ba5de58ec4500d3d837709bf20" exitCode=0 Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.981945 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8wxbl" event={"ID":"7618ac8f-2d0b-49da-943b-13dd939652d0","Type":"ContainerDied","Data":"5947c915f15c763c4aa2bccee5f0ac63b696b4ba5de58ec4500d3d837709bf20"} Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.981986 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8wxbl" event={"ID":"7618ac8f-2d0b-49da-943b-13dd939652d0","Type":"ContainerStarted","Data":"406bfb2b458f6d6c4878519e3f65fdf35e77bcc6a92227786e96f05b2180fb79"} Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.983732 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c6rld-config-t4mt7" 
event={"ID":"7421ee0c-c50b-4bfa-bd8e-81d555e3aca9","Type":"ContainerDied","Data":"a90854918811a87f45315a9921d03221c506f0162f72423a246bccb465f82e55"} Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.983797 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c6rld-config-t4mt7" Oct 09 10:44:23 crc kubenswrapper[4740]: I1009 10:44:23.983827 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a90854918811a87f45315a9921d03221c506f0162f72423a246bccb465f82e55" Oct 09 10:44:24 crc kubenswrapper[4740]: I1009 10:44:24.341246 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-c6rld-config-t4mt7"] Oct 09 10:44:24 crc kubenswrapper[4740]: I1009 10:44:24.346904 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-c6rld-config-t4mt7"] Oct 09 10:44:25 crc kubenswrapper[4740]: I1009 10:44:25.007953 4740 generic.go:334] "Generic (PLEG): container finished" podID="93ce61e7-f832-4491-a80f-0f0bc24d15cd" containerID="e95a494f4cec7122196c5455a913e1f022e67b48ab06ea717a5487b01b4353a2" exitCode=0 Oct 09 10:44:25 crc kubenswrapper[4740]: I1009 10:44:25.008251 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-btzp2" event={"ID":"93ce61e7-f832-4491-a80f-0f0bc24d15cd","Type":"ContainerDied","Data":"e95a494f4cec7122196c5455a913e1f022e67b48ab06ea717a5487b01b4353a2"} Oct 09 10:44:25 crc kubenswrapper[4740]: I1009 10:44:25.374423 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-r8w7m" Oct 09 10:44:25 crc kubenswrapper[4740]: I1009 10:44:25.381521 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-8wxbl" Oct 09 10:44:25 crc kubenswrapper[4740]: I1009 10:44:25.532412 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g22fp\" (UniqueName: \"kubernetes.io/projected/7618ac8f-2d0b-49da-943b-13dd939652d0-kube-api-access-g22fp\") pod \"7618ac8f-2d0b-49da-943b-13dd939652d0\" (UID: \"7618ac8f-2d0b-49da-943b-13dd939652d0\") " Oct 09 10:44:25 crc kubenswrapper[4740]: I1009 10:44:25.532478 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwrfv\" (UniqueName: \"kubernetes.io/projected/1cd4630f-ae0a-422a-9a31-a5b833aa9f79-kube-api-access-zwrfv\") pod \"1cd4630f-ae0a-422a-9a31-a5b833aa9f79\" (UID: \"1cd4630f-ae0a-422a-9a31-a5b833aa9f79\") " Oct 09 10:44:25 crc kubenswrapper[4740]: I1009 10:44:25.539000 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cd4630f-ae0a-422a-9a31-a5b833aa9f79-kube-api-access-zwrfv" (OuterVolumeSpecName: "kube-api-access-zwrfv") pod "1cd4630f-ae0a-422a-9a31-a5b833aa9f79" (UID: "1cd4630f-ae0a-422a-9a31-a5b833aa9f79"). InnerVolumeSpecName "kube-api-access-zwrfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:44:25 crc kubenswrapper[4740]: I1009 10:44:25.553455 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7618ac8f-2d0b-49da-943b-13dd939652d0-kube-api-access-g22fp" (OuterVolumeSpecName: "kube-api-access-g22fp") pod "7618ac8f-2d0b-49da-943b-13dd939652d0" (UID: "7618ac8f-2d0b-49da-943b-13dd939652d0"). InnerVolumeSpecName "kube-api-access-g22fp". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:44:25 crc kubenswrapper[4740]: I1009 10:44:25.634917 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g22fp\" (UniqueName: \"kubernetes.io/projected/7618ac8f-2d0b-49da-943b-13dd939652d0-kube-api-access-g22fp\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:25 crc kubenswrapper[4740]: I1009 10:44:25.634964 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwrfv\" (UniqueName: \"kubernetes.io/projected/1cd4630f-ae0a-422a-9a31-a5b833aa9f79-kube-api-access-zwrfv\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:25 crc kubenswrapper[4740]: I1009 10:44:25.764789 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7421ee0c-c50b-4bfa-bd8e-81d555e3aca9" path="/var/lib/kubelet/pods/7421ee0c-c50b-4bfa-bd8e-81d555e3aca9/volumes"
Oct 09 10:44:26 crc kubenswrapper[4740]: I1009 10:44:26.019266 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8wxbl"
Oct 09 10:44:26 crc kubenswrapper[4740]: I1009 10:44:26.020310 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8wxbl" event={"ID":"7618ac8f-2d0b-49da-943b-13dd939652d0","Type":"ContainerDied","Data":"406bfb2b458f6d6c4878519e3f65fdf35e77bcc6a92227786e96f05b2180fb79"}
Oct 09 10:44:26 crc kubenswrapper[4740]: I1009 10:44:26.020336 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="406bfb2b458f6d6c4878519e3f65fdf35e77bcc6a92227786e96f05b2180fb79"
Oct 09 10:44:26 crc kubenswrapper[4740]: I1009 10:44:26.021990 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-r8w7m" event={"ID":"1cd4630f-ae0a-422a-9a31-a5b833aa9f79","Type":"ContainerDied","Data":"f3c9644f9bbd67b9da2269d980ec06f7b53c790c073ec3c462dcc5df9555b714"}
Oct 09 10:44:26 crc kubenswrapper[4740]: I1009 10:44:26.022036 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3c9644f9bbd67b9da2269d980ec06f7b53c790c073ec3c462dcc5df9555b714"
Oct 09 10:44:26 crc kubenswrapper[4740]: I1009 10:44:26.022015 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-r8w7m"
Oct 09 10:44:28 crc kubenswrapper[4740]: I1009 10:44:28.038079 4740 generic.go:334] "Generic (PLEG): container finished" podID="9c004cb4-8052-425c-ac2e-11159a708cad" containerID="cf09e3d7980f8112f9248ddd31242a6090428f7c3bf95ae6fb5ba6ee2994cf74" exitCode=0
Oct 09 10:44:28 crc kubenswrapper[4740]: I1009 10:44:28.038159 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-cpnq8" event={"ID":"9c004cb4-8052-425c-ac2e-11159a708cad","Type":"ContainerDied","Data":"cf09e3d7980f8112f9248ddd31242a6090428f7c3bf95ae6fb5ba6ee2994cf74"}
Oct 09 10:44:28 crc kubenswrapper[4740]: I1009 10:44:28.728997 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-btzp2"
Oct 09 10:44:28 crc kubenswrapper[4740]: I1009 10:44:28.890789 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b86d\" (UniqueName: \"kubernetes.io/projected/93ce61e7-f832-4491-a80f-0f0bc24d15cd-kube-api-access-2b86d\") pod \"93ce61e7-f832-4491-a80f-0f0bc24d15cd\" (UID: \"93ce61e7-f832-4491-a80f-0f0bc24d15cd\") "
Oct 09 10:44:28 crc kubenswrapper[4740]: I1009 10:44:28.894884 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93ce61e7-f832-4491-a80f-0f0bc24d15cd-kube-api-access-2b86d" (OuterVolumeSpecName: "kube-api-access-2b86d") pod "93ce61e7-f832-4491-a80f-0f0bc24d15cd" (UID: "93ce61e7-f832-4491-a80f-0f0bc24d15cd"). InnerVolumeSpecName "kube-api-access-2b86d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:44:28 crc kubenswrapper[4740]: I1009 10:44:28.997169 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b86d\" (UniqueName: \"kubernetes.io/projected/93ce61e7-f832-4491-a80f-0f0bc24d15cd-kube-api-access-2b86d\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.050114 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-btzp2" event={"ID":"93ce61e7-f832-4491-a80f-0f0bc24d15cd","Type":"ContainerDied","Data":"b8b537abaf6300667a488d90e08dcb03ce8146d74708fdc939f118b58598fdad"}
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.050159 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8b537abaf6300667a488d90e08dcb03ce8146d74708fdc939f118b58598fdad"
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.050240 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-btzp2"
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.061240 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kh7ft" event={"ID":"365ec886-8ed3-4c64-a794-502bfef4fedf","Type":"ContainerStarted","Data":"1f1e975d16d493d8fdbe5f5ab458c032d2611afb170e4e49909e38dfdb50620c"}
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.085488 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-kh7ft" podStartSLOduration=2.332026089 podStartE2EDuration="7.085465111s" podCreationTimestamp="2025-10-09 10:44:22 +0000 UTC" firstStartedPulling="2025-10-09 10:44:23.820886691 +0000 UTC m=+1002.783087072" lastFinishedPulling="2025-10-09 10:44:28.574325713 +0000 UTC m=+1007.536526094" observedRunningTime="2025-10-09 10:44:29.083028964 +0000 UTC m=+1008.045229365" watchObservedRunningTime="2025-10-09 10:44:29.085465111 +0000 UTC m=+1008.047665502"
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.359510 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-cpnq8"
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.476072 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-66lt6"
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.515209 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c004cb4-8052-425c-ac2e-11159a708cad-combined-ca-bundle\") pod \"9c004cb4-8052-425c-ac2e-11159a708cad\" (UID: \"9c004cb4-8052-425c-ac2e-11159a708cad\") "
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.515331 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c004cb4-8052-425c-ac2e-11159a708cad-config-data\") pod \"9c004cb4-8052-425c-ac2e-11159a708cad\" (UID: \"9c004cb4-8052-425c-ac2e-11159a708cad\") "
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.515415 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mspnf\" (UniqueName: \"kubernetes.io/projected/9c004cb4-8052-425c-ac2e-11159a708cad-kube-api-access-mspnf\") pod \"9c004cb4-8052-425c-ac2e-11159a708cad\" (UID: \"9c004cb4-8052-425c-ac2e-11159a708cad\") "
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.515450 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c004cb4-8052-425c-ac2e-11159a708cad-db-sync-config-data\") pod \"9c004cb4-8052-425c-ac2e-11159a708cad\" (UID: \"9c004cb4-8052-425c-ac2e-11159a708cad\") "
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.522929 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c004cb4-8052-425c-ac2e-11159a708cad-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9c004cb4-8052-425c-ac2e-11159a708cad" (UID: "9c004cb4-8052-425c-ac2e-11159a708cad"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.526002 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c004cb4-8052-425c-ac2e-11159a708cad-kube-api-access-mspnf" (OuterVolumeSpecName: "kube-api-access-mspnf") pod "9c004cb4-8052-425c-ac2e-11159a708cad" (UID: "9c004cb4-8052-425c-ac2e-11159a708cad"). InnerVolumeSpecName "kube-api-access-mspnf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.538963 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-xfmk7"]
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.539185 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-xfmk7" podUID="15e8a2fd-eacd-4108-af4c-355e0e923d2d" containerName="dnsmasq-dns" containerID="cri-o://5fe8a375a64307f65032de6fa7d5f83e4375c39d9d4146fd082b66c7f2f5f874" gracePeriod=10
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.572150 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c004cb4-8052-425c-ac2e-11159a708cad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c004cb4-8052-425c-ac2e-11159a708cad" (UID: "9c004cb4-8052-425c-ac2e-11159a708cad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.591896 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c004cb4-8052-425c-ac2e-11159a708cad-config-data" (OuterVolumeSpecName: "config-data") pod "9c004cb4-8052-425c-ac2e-11159a708cad" (UID: "9c004cb4-8052-425c-ac2e-11159a708cad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.617279 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c004cb4-8052-425c-ac2e-11159a708cad-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.617395 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c004cb4-8052-425c-ac2e-11159a708cad-config-data\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.617420 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mspnf\" (UniqueName: \"kubernetes.io/projected/9c004cb4-8052-425c-ac2e-11159a708cad-kube-api-access-mspnf\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.617432 4740 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c004cb4-8052-425c-ac2e-11159a708cad-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:29 crc kubenswrapper[4740]: I1009 10:44:29.971401 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-xfmk7"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.081815 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-cpnq8"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.081828 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-cpnq8" event={"ID":"9c004cb4-8052-425c-ac2e-11159a708cad","Type":"ContainerDied","Data":"400f178c24606656dc27d9d7d445061afb791034ad4fcd048757b0e77eceac9b"}
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.081879 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="400f178c24606656dc27d9d7d445061afb791034ad4fcd048757b0e77eceac9b"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.085542 4740 generic.go:334] "Generic (PLEG): container finished" podID="15e8a2fd-eacd-4108-af4c-355e0e923d2d" containerID="5fe8a375a64307f65032de6fa7d5f83e4375c39d9d4146fd082b66c7f2f5f874" exitCode=0
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.085985 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-xfmk7" event={"ID":"15e8a2fd-eacd-4108-af4c-355e0e923d2d","Type":"ContainerDied","Data":"5fe8a375a64307f65032de6fa7d5f83e4375c39d9d4146fd082b66c7f2f5f874"}
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.086072 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-xfmk7" event={"ID":"15e8a2fd-eacd-4108-af4c-355e0e923d2d","Type":"ContainerDied","Data":"88e458d6b3d38aaaf2693c65b40764b2b55e4e6427796ba90a0583017aeeb32e"}
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.086009 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-xfmk7"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.086115 4740 scope.go:117] "RemoveContainer" containerID="5fe8a375a64307f65032de6fa7d5f83e4375c39d9d4146fd082b66c7f2f5f874"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.110686 4740 scope.go:117] "RemoveContainer" containerID="e37245c82cc0511a51cab427ce153f87ab6df9e19d64d3b62fc3e1e7e99c9914"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.125788 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-config\") pod \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\" (UID: \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\") "
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.126050 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-dns-svc\") pod \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\" (UID: \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\") "
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.126077 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phwgn\" (UniqueName: \"kubernetes.io/projected/15e8a2fd-eacd-4108-af4c-355e0e923d2d-kube-api-access-phwgn\") pod \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\" (UID: \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\") "
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.126102 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-ovsdbserver-nb\") pod \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\" (UID: \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\") "
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.126175 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-ovsdbserver-sb\") pod \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\" (UID: \"15e8a2fd-eacd-4108-af4c-355e0e923d2d\") "
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.127798 4740 scope.go:117] "RemoveContainer" containerID="5fe8a375a64307f65032de6fa7d5f83e4375c39d9d4146fd082b66c7f2f5f874"
Oct 09 10:44:30 crc kubenswrapper[4740]: E1009 10:44:30.128360 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fe8a375a64307f65032de6fa7d5f83e4375c39d9d4146fd082b66c7f2f5f874\": container with ID starting with 5fe8a375a64307f65032de6fa7d5f83e4375c39d9d4146fd082b66c7f2f5f874 not found: ID does not exist" containerID="5fe8a375a64307f65032de6fa7d5f83e4375c39d9d4146fd082b66c7f2f5f874"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.128405 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fe8a375a64307f65032de6fa7d5f83e4375c39d9d4146fd082b66c7f2f5f874"} err="failed to get container status \"5fe8a375a64307f65032de6fa7d5f83e4375c39d9d4146fd082b66c7f2f5f874\": rpc error: code = NotFound desc = could not find container \"5fe8a375a64307f65032de6fa7d5f83e4375c39d9d4146fd082b66c7f2f5f874\": container with ID starting with 5fe8a375a64307f65032de6fa7d5f83e4375c39d9d4146fd082b66c7f2f5f874 not found: ID does not exist"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.128440 4740 scope.go:117] "RemoveContainer" containerID="e37245c82cc0511a51cab427ce153f87ab6df9e19d64d3b62fc3e1e7e99c9914"
Oct 09 10:44:30 crc kubenswrapper[4740]: E1009 10:44:30.129291 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e37245c82cc0511a51cab427ce153f87ab6df9e19d64d3b62fc3e1e7e99c9914\": container with ID starting with e37245c82cc0511a51cab427ce153f87ab6df9e19d64d3b62fc3e1e7e99c9914 not found: ID does not exist" containerID="e37245c82cc0511a51cab427ce153f87ab6df9e19d64d3b62fc3e1e7e99c9914"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.129328 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e37245c82cc0511a51cab427ce153f87ab6df9e19d64d3b62fc3e1e7e99c9914"} err="failed to get container status \"e37245c82cc0511a51cab427ce153f87ab6df9e19d64d3b62fc3e1e7e99c9914\": rpc error: code = NotFound desc = could not find container \"e37245c82cc0511a51cab427ce153f87ab6df9e19d64d3b62fc3e1e7e99c9914\": container with ID starting with e37245c82cc0511a51cab427ce153f87ab6df9e19d64d3b62fc3e1e7e99c9914 not found: ID does not exist"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.130903 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15e8a2fd-eacd-4108-af4c-355e0e923d2d-kube-api-access-phwgn" (OuterVolumeSpecName: "kube-api-access-phwgn") pod "15e8a2fd-eacd-4108-af4c-355e0e923d2d" (UID: "15e8a2fd-eacd-4108-af4c-355e0e923d2d"). InnerVolumeSpecName "kube-api-access-phwgn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.163218 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "15e8a2fd-eacd-4108-af4c-355e0e923d2d" (UID: "15e8a2fd-eacd-4108-af4c-355e0e923d2d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.167086 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "15e8a2fd-eacd-4108-af4c-355e0e923d2d" (UID: "15e8a2fd-eacd-4108-af4c-355e0e923d2d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.168114 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "15e8a2fd-eacd-4108-af4c-355e0e923d2d" (UID: "15e8a2fd-eacd-4108-af4c-355e0e923d2d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.198485 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-config" (OuterVolumeSpecName: "config") pod "15e8a2fd-eacd-4108-af4c-355e0e923d2d" (UID: "15e8a2fd-eacd-4108-af4c-355e0e923d2d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.228263 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-config\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.228305 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.228318 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phwgn\" (UniqueName: \"kubernetes.io/projected/15e8a2fd-eacd-4108-af4c-355e0e923d2d-kube-api-access-phwgn\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.228333 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.228346 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15e8a2fd-eacd-4108-af4c-355e0e923d2d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.430507 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-nrzhg"]
Oct 09 10:44:30 crc kubenswrapper[4740]: E1009 10:44:30.430962 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c004cb4-8052-425c-ac2e-11159a708cad" containerName="glance-db-sync"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.430979 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c004cb4-8052-425c-ac2e-11159a708cad" containerName="glance-db-sync"
Oct 09 10:44:30 crc kubenswrapper[4740]: E1009 10:44:30.430999 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e8a2fd-eacd-4108-af4c-355e0e923d2d" containerName="init"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.431022 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e8a2fd-eacd-4108-af4c-355e0e923d2d" containerName="init"
Oct 09 10:44:30 crc kubenswrapper[4740]: E1009 10:44:30.431035 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7421ee0c-c50b-4bfa-bd8e-81d555e3aca9" containerName="ovn-config"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.431040 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="7421ee0c-c50b-4bfa-bd8e-81d555e3aca9" containerName="ovn-config"
Oct 09 10:44:30 crc kubenswrapper[4740]: E1009 10:44:30.431056 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd4630f-ae0a-422a-9a31-a5b833aa9f79" containerName="mariadb-database-create"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.431061 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd4630f-ae0a-422a-9a31-a5b833aa9f79" containerName="mariadb-database-create"
Oct 09 10:44:30 crc kubenswrapper[4740]: E1009 10:44:30.431071 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ce61e7-f832-4491-a80f-0f0bc24d15cd" containerName="mariadb-database-create"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.431077 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ce61e7-f832-4491-a80f-0f0bc24d15cd" containerName="mariadb-database-create"
Oct 09 10:44:30 crc kubenswrapper[4740]: E1009 10:44:30.431108 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e8a2fd-eacd-4108-af4c-355e0e923d2d" containerName="dnsmasq-dns"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.431114 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e8a2fd-eacd-4108-af4c-355e0e923d2d" containerName="dnsmasq-dns"
Oct 09 10:44:30 crc kubenswrapper[4740]: E1009 10:44:30.431121 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7618ac8f-2d0b-49da-943b-13dd939652d0" containerName="mariadb-database-create"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.431127 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="7618ac8f-2d0b-49da-943b-13dd939652d0" containerName="mariadb-database-create"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.431305 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="93ce61e7-f832-4491-a80f-0f0bc24d15cd" containerName="mariadb-database-create"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.431316 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c004cb4-8052-425c-ac2e-11159a708cad" containerName="glance-db-sync"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.431325 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cd4630f-ae0a-422a-9a31-a5b833aa9f79" containerName="mariadb-database-create"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.431352 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="7618ac8f-2d0b-49da-943b-13dd939652d0" containerName="mariadb-database-create"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.431360 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="15e8a2fd-eacd-4108-af4c-355e0e923d2d" containerName="dnsmasq-dns"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.431373 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="7421ee0c-c50b-4bfa-bd8e-81d555e3aca9" containerName="ovn-config"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.432796 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.437856 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-xfmk7"]
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.451082 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-xfmk7"]
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.462901 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-nrzhg"]
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.536497 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-config\") pod \"dnsmasq-dns-7ff5475cc9-nrzhg\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") " pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.536555 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwbqp\" (UniqueName: \"kubernetes.io/projected/08b9d2f6-45a1-4325-97a9-770a2f727ab4-kube-api-access-xwbqp\") pod \"dnsmasq-dns-7ff5475cc9-nrzhg\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") " pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.536601 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-nrzhg\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") " pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.536699 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-nrzhg\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") " pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.536727 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-nrzhg\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") " pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.537013 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-nrzhg\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") " pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.638382 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-config\") pod \"dnsmasq-dns-7ff5475cc9-nrzhg\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") " pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.638434 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwbqp\" (UniqueName: \"kubernetes.io/projected/08b9d2f6-45a1-4325-97a9-770a2f727ab4-kube-api-access-xwbqp\") pod \"dnsmasq-dns-7ff5475cc9-nrzhg\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") " pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.638455 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-nrzhg\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") " pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.638529 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-nrzhg\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") " pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.638567 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-nrzhg\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") " pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.638649 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-nrzhg\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") " pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.639639 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-nrzhg\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") " pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.639713 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-nrzhg\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") " pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.639870 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-config\") pod \"dnsmasq-dns-7ff5475cc9-nrzhg\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") " pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.640243 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-nrzhg\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") " pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.640715 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-nrzhg\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") " pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.655311 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwbqp\" (UniqueName: \"kubernetes.io/projected/08b9d2f6-45a1-4325-97a9-770a2f727ab4-kube-api-access-xwbqp\") pod \"dnsmasq-dns-7ff5475cc9-nrzhg\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") " pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg"
Oct 09 10:44:30 crc kubenswrapper[4740]: I1009 10:44:30.754631 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg"
Oct 09 10:44:31 crc kubenswrapper[4740]: I1009 10:44:31.202101 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-nrzhg"]
Oct 09 10:44:31 crc kubenswrapper[4740]: W1009 10:44:31.211684 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08b9d2f6_45a1_4325_97a9_770a2f727ab4.slice/crio-ccf3833ae5953b4c792c73ded601adbbe44c59146a71b07a43879dbea3f6a9f3 WatchSource:0}: Error finding container ccf3833ae5953b4c792c73ded601adbbe44c59146a71b07a43879dbea3f6a9f3: Status 404 returned error can't find the container with id ccf3833ae5953b4c792c73ded601adbbe44c59146a71b07a43879dbea3f6a9f3
Oct 09 10:44:31 crc kubenswrapper[4740]: I1009 10:44:31.771728 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15e8a2fd-eacd-4108-af4c-355e0e923d2d" path="/var/lib/kubelet/pods/15e8a2fd-eacd-4108-af4c-355e0e923d2d/volumes"
Oct 09 10:44:32 crc kubenswrapper[4740]: I1009 10:44:32.101941 4740 generic.go:334] "Generic (PLEG): container finished" podID="365ec886-8ed3-4c64-a794-502bfef4fedf" containerID="1f1e975d16d493d8fdbe5f5ab458c032d2611afb170e4e49909e38dfdb50620c" exitCode=0
Oct 09 10:44:32 crc kubenswrapper[4740]: I1009 10:44:32.102104 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kh7ft" event={"ID":"365ec886-8ed3-4c64-a794-502bfef4fedf","Type":"ContainerDied","Data":"1f1e975d16d493d8fdbe5f5ab458c032d2611afb170e4e49909e38dfdb50620c"}
Oct 09 10:44:32 crc kubenswrapper[4740]: I1009 10:44:32.103494 4740 generic.go:334] "Generic (PLEG): container finished" podID="08b9d2f6-45a1-4325-97a9-770a2f727ab4" containerID="3617c9d6219416895d96401839488d8cc4b33d7d1cbcf33b25dc2ae7ea7e0ede" exitCode=0
Oct 09 10:44:32 crc kubenswrapper[4740]: I1009 10:44:32.103523 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg" event={"ID":"08b9d2f6-45a1-4325-97a9-770a2f727ab4","Type":"ContainerDied","Data":"3617c9d6219416895d96401839488d8cc4b33d7d1cbcf33b25dc2ae7ea7e0ede"}
Oct 09 10:44:32 crc kubenswrapper[4740]: I1009 10:44:32.103544 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg" event={"ID":"08b9d2f6-45a1-4325-97a9-770a2f727ab4","Type":"ContainerStarted","Data":"ccf3833ae5953b4c792c73ded601adbbe44c59146a71b07a43879dbea3f6a9f3"}
Oct 09 10:44:32 crc kubenswrapper[4740]: I1009 10:44:32.630179 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-dc53-account-create-56z9f"]
Oct 09 10:44:32 crc kubenswrapper[4740]: I1009 10:44:32.631232 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-dc53-account-create-56z9f"
Oct 09 10:44:32 crc kubenswrapper[4740]: I1009 10:44:32.633075 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Oct 09 10:44:32 crc kubenswrapper[4740]: I1009 10:44:32.640312 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-dc53-account-create-56z9f"]
Oct 09 10:44:32 crc kubenswrapper[4740]: I1009 10:44:32.674884 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7nqj\" (UniqueName: \"kubernetes.io/projected/dbcc00a8-003c-48f3-b7e5-5bade54830fe-kube-api-access-v7nqj\") pod \"barbican-dc53-account-create-56z9f\" (UID: \"dbcc00a8-003c-48f3-b7e5-5bade54830fe\") " pod="openstack/barbican-dc53-account-create-56z9f"
Oct 09 10:44:32 crc kubenswrapper[4740]: I1009 10:44:32.728131 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-4f86-account-create-b7lmp"]
Oct 09 10:44:32 crc kubenswrapper[4740]: I1009 10:44:32.729825 4740 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-4f86-account-create-b7lmp" Oct 09 10:44:32 crc kubenswrapper[4740]: I1009 10:44:32.732869 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 09 10:44:32 crc kubenswrapper[4740]: I1009 10:44:32.739385 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4f86-account-create-b7lmp"] Oct 09 10:44:32 crc kubenswrapper[4740]: I1009 10:44:32.776321 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7nqj\" (UniqueName: \"kubernetes.io/projected/dbcc00a8-003c-48f3-b7e5-5bade54830fe-kube-api-access-v7nqj\") pod \"barbican-dc53-account-create-56z9f\" (UID: \"dbcc00a8-003c-48f3-b7e5-5bade54830fe\") " pod="openstack/barbican-dc53-account-create-56z9f" Oct 09 10:44:32 crc kubenswrapper[4740]: I1009 10:44:32.776396 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n8t7\" (UniqueName: \"kubernetes.io/projected/da198044-a030-4982-bdb9-9a232c4a1191-kube-api-access-7n8t7\") pod \"cinder-4f86-account-create-b7lmp\" (UID: \"da198044-a030-4982-bdb9-9a232c4a1191\") " pod="openstack/cinder-4f86-account-create-b7lmp" Oct 09 10:44:32 crc kubenswrapper[4740]: I1009 10:44:32.793391 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7nqj\" (UniqueName: \"kubernetes.io/projected/dbcc00a8-003c-48f3-b7e5-5bade54830fe-kube-api-access-v7nqj\") pod \"barbican-dc53-account-create-56z9f\" (UID: \"dbcc00a8-003c-48f3-b7e5-5bade54830fe\") " pod="openstack/barbican-dc53-account-create-56z9f" Oct 09 10:44:32 crc kubenswrapper[4740]: I1009 10:44:32.877700 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n8t7\" (UniqueName: \"kubernetes.io/projected/da198044-a030-4982-bdb9-9a232c4a1191-kube-api-access-7n8t7\") pod \"cinder-4f86-account-create-b7lmp\" (UID: 
\"da198044-a030-4982-bdb9-9a232c4a1191\") " pod="openstack/cinder-4f86-account-create-b7lmp" Oct 09 10:44:32 crc kubenswrapper[4740]: I1009 10:44:32.894560 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n8t7\" (UniqueName: \"kubernetes.io/projected/da198044-a030-4982-bdb9-9a232c4a1191-kube-api-access-7n8t7\") pod \"cinder-4f86-account-create-b7lmp\" (UID: \"da198044-a030-4982-bdb9-9a232c4a1191\") " pod="openstack/cinder-4f86-account-create-b7lmp" Oct 09 10:44:32 crc kubenswrapper[4740]: I1009 10:44:32.948738 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-dc53-account-create-56z9f" Oct 09 10:44:33 crc kubenswrapper[4740]: I1009 10:44:33.051322 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4f86-account-create-b7lmp" Oct 09 10:44:33 crc kubenswrapper[4740]: I1009 10:44:33.123563 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg" event={"ID":"08b9d2f6-45a1-4325-97a9-770a2f727ab4","Type":"ContainerStarted","Data":"5dc8f80fb1a4b163e0cf256a0fa59999db07781e836d61de0481bad7dcdcf9b0"} Oct 09 10:44:33 crc kubenswrapper[4740]: I1009 10:44:33.123613 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg" Oct 09 10:44:33 crc kubenswrapper[4740]: I1009 10:44:33.153753 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg" podStartSLOduration=3.153579814 podStartE2EDuration="3.153579814s" podCreationTimestamp="2025-10-09 10:44:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:44:33.149497152 +0000 UTC m=+1012.111697573" watchObservedRunningTime="2025-10-09 10:44:33.153579814 +0000 UTC m=+1012.115780235" Oct 09 10:44:33 crc kubenswrapper[4740]: I1009 10:44:33.413302 
4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-dc53-account-create-56z9f"] Oct 09 10:44:33 crc kubenswrapper[4740]: W1009 10:44:33.416223 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbcc00a8_003c_48f3_b7e5_5bade54830fe.slice/crio-986d557d0b6ffdc7e43d5c7f40ab650fba45fa5d1976af044ce41693b3b90fab WatchSource:0}: Error finding container 986d557d0b6ffdc7e43d5c7f40ab650fba45fa5d1976af044ce41693b3b90fab: Status 404 returned error can't find the container with id 986d557d0b6ffdc7e43d5c7f40ab650fba45fa5d1976af044ce41693b3b90fab Oct 09 10:44:33 crc kubenswrapper[4740]: I1009 10:44:33.488277 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4f86-account-create-b7lmp"] Oct 09 10:44:33 crc kubenswrapper[4740]: W1009 10:44:33.549983 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda198044_a030_4982_bdb9_9a232c4a1191.slice/crio-c2a3538e66ea27e009c64f537bb8092b357074cda5e543e424e3d7af31173508 WatchSource:0}: Error finding container c2a3538e66ea27e009c64f537bb8092b357074cda5e543e424e3d7af31173508: Status 404 returned error can't find the container with id c2a3538e66ea27e009c64f537bb8092b357074cda5e543e424e3d7af31173508 Oct 09 10:44:33 crc kubenswrapper[4740]: I1009 10:44:33.551635 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-kh7ft" Oct 09 10:44:33 crc kubenswrapper[4740]: I1009 10:44:33.695022 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/365ec886-8ed3-4c64-a794-502bfef4fedf-config-data\") pod \"365ec886-8ed3-4c64-a794-502bfef4fedf\" (UID: \"365ec886-8ed3-4c64-a794-502bfef4fedf\") " Oct 09 10:44:33 crc kubenswrapper[4740]: I1009 10:44:33.695260 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bp9c\" (UniqueName: \"kubernetes.io/projected/365ec886-8ed3-4c64-a794-502bfef4fedf-kube-api-access-7bp9c\") pod \"365ec886-8ed3-4c64-a794-502bfef4fedf\" (UID: \"365ec886-8ed3-4c64-a794-502bfef4fedf\") " Oct 09 10:44:33 crc kubenswrapper[4740]: I1009 10:44:33.695354 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365ec886-8ed3-4c64-a794-502bfef4fedf-combined-ca-bundle\") pod \"365ec886-8ed3-4c64-a794-502bfef4fedf\" (UID: \"365ec886-8ed3-4c64-a794-502bfef4fedf\") " Oct 09 10:44:33 crc kubenswrapper[4740]: I1009 10:44:33.701832 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/365ec886-8ed3-4c64-a794-502bfef4fedf-kube-api-access-7bp9c" (OuterVolumeSpecName: "kube-api-access-7bp9c") pod "365ec886-8ed3-4c64-a794-502bfef4fedf" (UID: "365ec886-8ed3-4c64-a794-502bfef4fedf"). InnerVolumeSpecName "kube-api-access-7bp9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:44:33 crc kubenswrapper[4740]: I1009 10:44:33.723344 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365ec886-8ed3-4c64-a794-502bfef4fedf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "365ec886-8ed3-4c64-a794-502bfef4fedf" (UID: "365ec886-8ed3-4c64-a794-502bfef4fedf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:44:33 crc kubenswrapper[4740]: I1009 10:44:33.742536 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365ec886-8ed3-4c64-a794-502bfef4fedf-config-data" (OuterVolumeSpecName: "config-data") pod "365ec886-8ed3-4c64-a794-502bfef4fedf" (UID: "365ec886-8ed3-4c64-a794-502bfef4fedf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:44:33 crc kubenswrapper[4740]: I1009 10:44:33.797518 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bp9c\" (UniqueName: \"kubernetes.io/projected/365ec886-8ed3-4c64-a794-502bfef4fedf-kube-api-access-7bp9c\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:33 crc kubenswrapper[4740]: I1009 10:44:33.797571 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365ec886-8ed3-4c64-a794-502bfef4fedf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:33 crc kubenswrapper[4740]: I1009 10:44:33.797581 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/365ec886-8ed3-4c64-a794-502bfef4fedf-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.134177 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kh7ft" event={"ID":"365ec886-8ed3-4c64-a794-502bfef4fedf","Type":"ContainerDied","Data":"4ede428a4b70622ac0064ba3f143bff30e70a5fc1845ea85cb75fd4739e42f6e"} Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.134214 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ede428a4b70622ac0064ba3f143bff30e70a5fc1845ea85cb75fd4739e42f6e" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.134238 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-kh7ft" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.135617 4740 generic.go:334] "Generic (PLEG): container finished" podID="da198044-a030-4982-bdb9-9a232c4a1191" containerID="3fd82077a9bea0733b9951d5635c15b01d30d6035800b1115bbc23f7e25999f4" exitCode=0 Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.135722 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4f86-account-create-b7lmp" event={"ID":"da198044-a030-4982-bdb9-9a232c4a1191","Type":"ContainerDied","Data":"3fd82077a9bea0733b9951d5635c15b01d30d6035800b1115bbc23f7e25999f4"} Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.135755 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4f86-account-create-b7lmp" event={"ID":"da198044-a030-4982-bdb9-9a232c4a1191","Type":"ContainerStarted","Data":"c2a3538e66ea27e009c64f537bb8092b357074cda5e543e424e3d7af31173508"} Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.138236 4740 generic.go:334] "Generic (PLEG): container finished" podID="dbcc00a8-003c-48f3-b7e5-5bade54830fe" containerID="afa0bcca481f8ce37f904baba840c3e1bad385de4188ae6934673bd790799008" exitCode=0 Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.138593 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-dc53-account-create-56z9f" event={"ID":"dbcc00a8-003c-48f3-b7e5-5bade54830fe","Type":"ContainerDied","Data":"afa0bcca481f8ce37f904baba840c3e1bad385de4188ae6934673bd790799008"} Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.138624 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-dc53-account-create-56z9f" event={"ID":"dbcc00a8-003c-48f3-b7e5-5bade54830fe","Type":"ContainerStarted","Data":"986d557d0b6ffdc7e43d5c7f40ab650fba45fa5d1976af044ce41693b3b90fab"} Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.387912 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-7ff5475cc9-nrzhg"] Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.410679 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vjfrh"] Oct 09 10:44:34 crc kubenswrapper[4740]: E1009 10:44:34.411073 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="365ec886-8ed3-4c64-a794-502bfef4fedf" containerName="keystone-db-sync" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.411092 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="365ec886-8ed3-4c64-a794-502bfef4fedf" containerName="keystone-db-sync" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.411564 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="365ec886-8ed3-4c64-a794-502bfef4fedf" containerName="keystone-db-sync" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.417957 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.421688 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.421989 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.426918 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.432295 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-52ljb" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.462737 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vjfrh"] Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.494847 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j"] Oct 09 10:44:34 crc 
kubenswrapper[4740]: I1009 10:44:34.496171 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.506526 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j"] Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.509744 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-lzr2j\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.509796 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-scripts\") pod \"keystone-bootstrap-vjfrh\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.509815 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-credential-keys\") pod \"keystone-bootstrap-vjfrh\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.509835 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n92m\" (UniqueName: \"kubernetes.io/projected/9795efc1-c793-4c31-b34c-c86cb6ea3bca-kube-api-access-4n92m\") pod \"dnsmasq-dns-5c5cc7c5ff-lzr2j\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 
10:44:34.509853 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-fernet-keys\") pod \"keystone-bootstrap-vjfrh\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.509878 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-config-data\") pod \"keystone-bootstrap-vjfrh\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.509906 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-combined-ca-bundle\") pod \"keystone-bootstrap-vjfrh\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.509931 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-lzr2j\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.509951 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-config\") pod \"dnsmasq-dns-5c5cc7c5ff-lzr2j\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.509966 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-lzr2j\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.509998 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z69m\" (UniqueName: \"kubernetes.io/projected/2207949f-1d38-47c3-9573-24dfa0a2db9e-kube-api-access-7z69m\") pod \"keystone-bootstrap-vjfrh\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.510012 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-lzr2j\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.619593 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-combined-ca-bundle\") pod \"keystone-bootstrap-vjfrh\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.619922 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-lzr2j\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.620027 
4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-config\") pod \"dnsmasq-dns-5c5cc7c5ff-lzr2j\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.620106 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-lzr2j\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.620195 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z69m\" (UniqueName: \"kubernetes.io/projected/2207949f-1d38-47c3-9573-24dfa0a2db9e-kube-api-access-7z69m\") pod \"keystone-bootstrap-vjfrh\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.620351 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-lzr2j\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.620441 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-lzr2j\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.620520 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-scripts\") pod \"keystone-bootstrap-vjfrh\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.620587 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-credential-keys\") pod \"keystone-bootstrap-vjfrh\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.620668 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n92m\" (UniqueName: \"kubernetes.io/projected/9795efc1-c793-4c31-b34c-c86cb6ea3bca-kube-api-access-4n92m\") pod \"dnsmasq-dns-5c5cc7c5ff-lzr2j\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.620748 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-fernet-keys\") pod \"keystone-bootstrap-vjfrh\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.620853 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-config-data\") pod \"keystone-bootstrap-vjfrh\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.620994 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-lzr2j\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.621035 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-lzr2j\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.621757 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-config\") pod \"dnsmasq-dns-5c5cc7c5ff-lzr2j\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.628025 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-fernet-keys\") pod \"keystone-bootstrap-vjfrh\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.628393 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-lzr2j\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.628553 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-lzr2j\" (UID: 
\"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.632982 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-config-data\") pod \"keystone-bootstrap-vjfrh\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.633731 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-scripts\") pod \"keystone-bootstrap-vjfrh\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.638237 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5584f4df97-wsq5t"] Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.641762 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5584f4df97-wsq5t" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.645404 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-combined-ca-bundle\") pod \"keystone-bootstrap-vjfrh\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.647367 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-credential-keys\") pod \"keystone-bootstrap-vjfrh\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.651465 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.651686 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-8nt6k" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.652421 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.657988 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.658280 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n92m\" (UniqueName: \"kubernetes.io/projected/9795efc1-c793-4c31-b34c-c86cb6ea3bca-kube-api-access-4n92m\") pod \"dnsmasq-dns-5c5cc7c5ff-lzr2j\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.681666 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/horizon-5584f4df97-wsq5t"] Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.685239 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z69m\" (UniqueName: \"kubernetes.io/projected/2207949f-1d38-47c3-9573-24dfa0a2db9e-kube-api-access-7z69m\") pod \"keystone-bootstrap-vjfrh\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.721364 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j"] Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.721962 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.723470 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71cf9683-bcf4-4367-8365-08ef2fbe73d5-config-data\") pod \"horizon-5584f4df97-wsq5t\" (UID: \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\") " pod="openstack/horizon-5584f4df97-wsq5t" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.723801 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71cf9683-bcf4-4367-8365-08ef2fbe73d5-scripts\") pod \"horizon-5584f4df97-wsq5t\" (UID: \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\") " pod="openstack/horizon-5584f4df97-wsq5t" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.723836 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/71cf9683-bcf4-4367-8365-08ef2fbe73d5-horizon-secret-key\") pod \"horizon-5584f4df97-wsq5t\" (UID: \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\") " pod="openstack/horizon-5584f4df97-wsq5t" Oct 09 10:44:34 crc 
kubenswrapper[4740]: I1009 10:44:34.723869 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js2kx\" (UniqueName: \"kubernetes.io/projected/71cf9683-bcf4-4367-8365-08ef2fbe73d5-kube-api-access-js2kx\") pod \"horizon-5584f4df97-wsq5t\" (UID: \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\") " pod="openstack/horizon-5584f4df97-wsq5t" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.723943 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71cf9683-bcf4-4367-8365-08ef2fbe73d5-logs\") pod \"horizon-5584f4df97-wsq5t\" (UID: \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\") " pod="openstack/horizon-5584f4df97-wsq5t" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.764863 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-tr9f9"] Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.773065 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.800975 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.801119 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.802360 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.805827 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.806365 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.806497 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.815126 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-gdptk" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.815285 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-tr9f9"] Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.826446 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71cf9683-bcf4-4367-8365-08ef2fbe73d5-config-data\") pod \"horizon-5584f4df97-wsq5t\" (UID: \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\") " pod="openstack/horizon-5584f4df97-wsq5t" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.826487 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71cf9683-bcf4-4367-8365-08ef2fbe73d5-scripts\") pod \"horizon-5584f4df97-wsq5t\" (UID: \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\") " pod="openstack/horizon-5584f4df97-wsq5t" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.826552 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/71cf9683-bcf4-4367-8365-08ef2fbe73d5-horizon-secret-key\") pod \"horizon-5584f4df97-wsq5t\" (UID: 
\"71cf9683-bcf4-4367-8365-08ef2fbe73d5\") " pod="openstack/horizon-5584f4df97-wsq5t" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.826586 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-tr9f9\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.826606 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js2kx\" (UniqueName: \"kubernetes.io/projected/71cf9683-bcf4-4367-8365-08ef2fbe73d5-kube-api-access-js2kx\") pod \"horizon-5584f4df97-wsq5t\" (UID: \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\") " pod="openstack/horizon-5584f4df97-wsq5t" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.826656 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-config\") pod \"dnsmasq-dns-8b5c85b87-tr9f9\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.826684 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-tr9f9\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.826710 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-tr9f9\" (UID: 
\"3e84b517-ac89-462b-baae-559a25766f7e\") " pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.826809 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-tr9f9\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.826828 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71cf9683-bcf4-4367-8365-08ef2fbe73d5-logs\") pod \"horizon-5584f4df97-wsq5t\" (UID: \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\") " pod="openstack/horizon-5584f4df97-wsq5t" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.826860 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pgrl\" (UniqueName: \"kubernetes.io/projected/3e84b517-ac89-462b-baae-559a25766f7e-kube-api-access-8pgrl\") pod \"dnsmasq-dns-8b5c85b87-tr9f9\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.828246 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71cf9683-bcf4-4367-8365-08ef2fbe73d5-config-data\") pod \"horizon-5584f4df97-wsq5t\" (UID: \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\") " pod="openstack/horizon-5584f4df97-wsq5t" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.828659 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71cf9683-bcf4-4367-8365-08ef2fbe73d5-scripts\") pod \"horizon-5584f4df97-wsq5t\" (UID: \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\") " 
pod="openstack/horizon-5584f4df97-wsq5t" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.834392 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-dzzbk"] Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.842160 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/71cf9683-bcf4-4367-8365-08ef2fbe73d5-horizon-secret-key\") pod \"horizon-5584f4df97-wsq5t\" (UID: \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\") " pod="openstack/horizon-5584f4df97-wsq5t" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.843648 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71cf9683-bcf4-4367-8365-08ef2fbe73d5-logs\") pod \"horizon-5584f4df97-wsq5t\" (UID: \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\") " pod="openstack/horizon-5584f4df97-wsq5t" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.845342 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-dzzbk" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.851072 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.851368 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-47wgr" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.851645 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.884412 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js2kx\" (UniqueName: \"kubernetes.io/projected/71cf9683-bcf4-4367-8365-08ef2fbe73d5-kube-api-access-js2kx\") pod \"horizon-5584f4df97-wsq5t\" (UID: \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\") " pod="openstack/horizon-5584f4df97-wsq5t" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.929659 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-tr9f9\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.929718 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-config-data\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.929734 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t2xv\" (UniqueName: 
\"kubernetes.io/projected/996070ad-888f-48c2-a368-de97d22b13c1-kube-api-access-6t2xv\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.929791 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-config\") pod \"dnsmasq-dns-8b5c85b87-tr9f9\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.929812 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/996070ad-888f-48c2-a368-de97d22b13c1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.929830 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-tr9f9\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.929848 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-tr9f9\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.929876 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.929899 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-tr9f9\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.929913 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-scripts\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.929931 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pgrl\" (UniqueName: \"kubernetes.io/projected/3e84b517-ac89-462b-baae-559a25766f7e-kube-api-access-8pgrl\") pod \"dnsmasq-dns-8b5c85b87-tr9f9\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.929951 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.929964 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.930006 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/996070ad-888f-48c2-a368-de97d22b13c1-logs\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.930922 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-tr9f9\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.932347 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-tr9f9\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.933277 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-config\") pod \"dnsmasq-dns-8b5c85b87-tr9f9\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.936985 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-dns-svc\") pod 
\"dnsmasq-dns-8b5c85b87-tr9f9\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.939804 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6d99fcb759-vc4hb"] Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.945537 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d99fcb759-vc4hb" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.946084 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-tr9f9\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.950460 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pgrl\" (UniqueName: \"kubernetes.io/projected/3e84b517-ac89-462b-baae-559a25766f7e-kube-api-access-8pgrl\") pod \"dnsmasq-dns-8b5c85b87-tr9f9\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.950547 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5584f4df97-wsq5t" Oct 09 10:44:34 crc kubenswrapper[4740]: I1009 10:44:34.956962 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-dzzbk"] Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:34.993832 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d99fcb759-vc4hb"] Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.004655 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.011679 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.015979 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.018547 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.018770 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.037522 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.040907 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5rqm\" (UniqueName: \"kubernetes.io/projected/be99ba98-fb4b-4609-986e-3636a4a8f244-kube-api-access-m5rqm\") pod \"placement-db-sync-dzzbk\" (UID: \"be99ba98-fb4b-4609-986e-3636a4a8f244\") " pod="openstack/placement-db-sync-dzzbk" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.041288 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-config-data\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.041317 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t2xv\" (UniqueName: \"kubernetes.io/projected/996070ad-888f-48c2-a368-de97d22b13c1-kube-api-access-6t2xv\") pod \"glance-default-external-api-0\" (UID: 
\"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.042341 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.043787 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be99ba98-fb4b-4609-986e-3636a4a8f244-scripts\") pod \"placement-db-sync-dzzbk\" (UID: \"be99ba98-fb4b-4609-986e-3636a4a8f244\") " pod="openstack/placement-db-sync-dzzbk" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.043830 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/996070ad-888f-48c2-a368-de97d22b13c1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.043970 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.044028 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-scripts\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.044076 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") 
pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.044099 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.044166 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be99ba98-fb4b-4609-986e-3636a4a8f244-logs\") pod \"placement-db-sync-dzzbk\" (UID: \"be99ba98-fb4b-4609-986e-3636a4a8f244\") " pod="openstack/placement-db-sync-dzzbk" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.044192 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be99ba98-fb4b-4609-986e-3636a4a8f244-combined-ca-bundle\") pod \"placement-db-sync-dzzbk\" (UID: \"be99ba98-fb4b-4609-986e-3636a4a8f244\") " pod="openstack/placement-db-sync-dzzbk" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.044249 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/996070ad-888f-48c2-a368-de97d22b13c1-logs\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.044267 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be99ba98-fb4b-4609-986e-3636a4a8f244-config-data\") pod \"placement-db-sync-dzzbk\" (UID: 
\"be99ba98-fb4b-4609-986e-3636a4a8f244\") " pod="openstack/placement-db-sync-dzzbk" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.045710 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.045543 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/996070ad-888f-48c2-a368-de97d22b13c1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.046110 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/996070ad-888f-48c2-a368-de97d22b13c1-logs\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.049262 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-config-data\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.065981 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t2xv\" (UniqueName: \"kubernetes.io/projected/996070ad-888f-48c2-a368-de97d22b13c1-kube-api-access-6t2xv\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" 
Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.072537 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.074232 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-scripts\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.079884 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.082710 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.082980 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.083555 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.085161 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.100265 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.101876 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.147608 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x29x2\" (UniqueName: \"kubernetes.io/projected/1f91334a-239f-4459-b885-aa9865bc6a04-kube-api-access-x29x2\") pod \"ceilometer-0\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.147698 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.147892 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be99ba98-fb4b-4609-986e-3636a4a8f244-scripts\") pod \"placement-db-sync-dzzbk\" (UID: \"be99ba98-fb4b-4609-986e-3636a4a8f244\") " pod="openstack/placement-db-sync-dzzbk" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.147949 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f91334a-239f-4459-b885-aa9865bc6a04-run-httpd\") pod \"ceilometer-0\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.147972 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6176777d-0028-4420-a29c-cbf0b361c378-horizon-secret-key\") pod \"horizon-6d99fcb759-vc4hb\" (UID: \"6176777d-0028-4420-a29c-cbf0b361c378\") " pod="openstack/horizon-6d99fcb759-vc4hb" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.148032 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6176777d-0028-4420-a29c-cbf0b361c378-logs\") pod \"horizon-6d99fcb759-vc4hb\" (UID: \"6176777d-0028-4420-a29c-cbf0b361c378\") " pod="openstack/horizon-6d99fcb759-vc4hb" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.148095 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6176777d-0028-4420-a29c-cbf0b361c378-config-data\") pod \"horizon-6d99fcb759-vc4hb\" (UID: \"6176777d-0028-4420-a29c-cbf0b361c378\") " pod="openstack/horizon-6d99fcb759-vc4hb" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.148138 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f91334a-239f-4459-b885-aa9865bc6a04-log-httpd\") pod \"ceilometer-0\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.148280 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.148323 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/be99ba98-fb4b-4609-986e-3636a4a8f244-logs\") pod \"placement-db-sync-dzzbk\" (UID: \"be99ba98-fb4b-4609-986e-3636a4a8f244\") " pod="openstack/placement-db-sync-dzzbk" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.148377 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-scripts\") pod \"ceilometer-0\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.148404 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be99ba98-fb4b-4609-986e-3636a4a8f244-combined-ca-bundle\") pod \"placement-db-sync-dzzbk\" (UID: \"be99ba98-fb4b-4609-986e-3636a4a8f244\") " pod="openstack/placement-db-sync-dzzbk" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.148468 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6176777d-0028-4420-a29c-cbf0b361c378-scripts\") pod \"horizon-6d99fcb759-vc4hb\" (UID: \"6176777d-0028-4420-a29c-cbf0b361c378\") " pod="openstack/horizon-6d99fcb759-vc4hb" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.148497 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be99ba98-fb4b-4609-986e-3636a4a8f244-config-data\") pod \"placement-db-sync-dzzbk\" (UID: \"be99ba98-fb4b-4609-986e-3636a4a8f244\") " pod="openstack/placement-db-sync-dzzbk" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.148555 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l582g\" (UniqueName: \"kubernetes.io/projected/6176777d-0028-4420-a29c-cbf0b361c378-kube-api-access-l582g\") pod 
\"horizon-6d99fcb759-vc4hb\" (UID: \"6176777d-0028-4420-a29c-cbf0b361c378\") " pod="openstack/horizon-6d99fcb759-vc4hb" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.148588 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5rqm\" (UniqueName: \"kubernetes.io/projected/be99ba98-fb4b-4609-986e-3636a4a8f244-kube-api-access-m5rqm\") pod \"placement-db-sync-dzzbk\" (UID: \"be99ba98-fb4b-4609-986e-3636a4a8f244\") " pod="openstack/placement-db-sync-dzzbk" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.148640 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-config-data\") pod \"ceilometer-0\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.150841 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be99ba98-fb4b-4609-986e-3636a4a8f244-logs\") pod \"placement-db-sync-dzzbk\" (UID: \"be99ba98-fb4b-4609-986e-3636a4a8f244\") " pod="openstack/placement-db-sync-dzzbk" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.171415 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be99ba98-fb4b-4609-986e-3636a4a8f244-config-data\") pod \"placement-db-sync-dzzbk\" (UID: \"be99ba98-fb4b-4609-986e-3636a4a8f244\") " pod="openstack/placement-db-sync-dzzbk" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.172402 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5rqm\" (UniqueName: \"kubernetes.io/projected/be99ba98-fb4b-4609-986e-3636a4a8f244-kube-api-access-m5rqm\") pod \"placement-db-sync-dzzbk\" (UID: \"be99ba98-fb4b-4609-986e-3636a4a8f244\") " pod="openstack/placement-db-sync-dzzbk" Oct 09 10:44:35 crc 
kubenswrapper[4740]: I1009 10:44:35.175159 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be99ba98-fb4b-4609-986e-3636a4a8f244-scripts\") pod \"placement-db-sync-dzzbk\" (UID: \"be99ba98-fb4b-4609-986e-3636a4a8f244\") " pod="openstack/placement-db-sync-dzzbk" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.175638 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be99ba98-fb4b-4609-986e-3636a4a8f244-combined-ca-bundle\") pod \"placement-db-sync-dzzbk\" (UID: \"be99ba98-fb4b-4609-986e-3636a4a8f244\") " pod="openstack/placement-db-sync-dzzbk" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.249638 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbpq6\" (UniqueName: \"kubernetes.io/projected/d9a2cdb4-f238-48ed-a0be-d3f895c72868-kube-api-access-mbpq6\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.249719 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.249740 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-scripts\") pod \"ceilometer-0\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.249800 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/6176777d-0028-4420-a29c-cbf0b361c378-scripts\") pod \"horizon-6d99fcb759-vc4hb\" (UID: \"6176777d-0028-4420-a29c-cbf0b361c378\") " pod="openstack/horizon-6d99fcb759-vc4hb" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.249835 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.249853 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l582g\" (UniqueName: \"kubernetes.io/projected/6176777d-0028-4420-a29c-cbf0b361c378-kube-api-access-l582g\") pod \"horizon-6d99fcb759-vc4hb\" (UID: \"6176777d-0028-4420-a29c-cbf0b361c378\") " pod="openstack/horizon-6d99fcb759-vc4hb" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.249875 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.249894 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-config-data\") pod \"ceilometer-0\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.249909 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") 
pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.249945 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9a2cdb4-f238-48ed-a0be-d3f895c72868-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.249960 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.249977 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x29x2\" (UniqueName: \"kubernetes.io/projected/1f91334a-239f-4459-b885-aa9865bc6a04-kube-api-access-x29x2\") pod \"ceilometer-0\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.250013 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9a2cdb4-f238-48ed-a0be-d3f895c72868-logs\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.250036 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"1f91334a-239f-4459-b885-aa9865bc6a04\") " pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.250075 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f91334a-239f-4459-b885-aa9865bc6a04-run-httpd\") pod \"ceilometer-0\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.250090 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6176777d-0028-4420-a29c-cbf0b361c378-horizon-secret-key\") pod \"horizon-6d99fcb759-vc4hb\" (UID: \"6176777d-0028-4420-a29c-cbf0b361c378\") " pod="openstack/horizon-6d99fcb759-vc4hb" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.250106 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6176777d-0028-4420-a29c-cbf0b361c378-logs\") pod \"horizon-6d99fcb759-vc4hb\" (UID: \"6176777d-0028-4420-a29c-cbf0b361c378\") " pod="openstack/horizon-6d99fcb759-vc4hb" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.250127 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6176777d-0028-4420-a29c-cbf0b361c378-config-data\") pod \"horizon-6d99fcb759-vc4hb\" (UID: \"6176777d-0028-4420-a29c-cbf0b361c378\") " pod="openstack/horizon-6d99fcb759-vc4hb" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.250166 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: 
I1009 10:44:35.250219 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f91334a-239f-4459-b885-aa9865bc6a04-log-httpd\") pod \"ceilometer-0\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.252959 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6176777d-0028-4420-a29c-cbf0b361c378-logs\") pod \"horizon-6d99fcb759-vc4hb\" (UID: \"6176777d-0028-4420-a29c-cbf0b361c378\") " pod="openstack/horizon-6d99fcb759-vc4hb" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.253870 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f91334a-239f-4459-b885-aa9865bc6a04-log-httpd\") pod \"ceilometer-0\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.255613 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6176777d-0028-4420-a29c-cbf0b361c378-config-data\") pod \"horizon-6d99fcb759-vc4hb\" (UID: \"6176777d-0028-4420-a29c-cbf0b361c378\") " pod="openstack/horizon-6d99fcb759-vc4hb" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.256042 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6176777d-0028-4420-a29c-cbf0b361c378-scripts\") pod \"horizon-6d99fcb759-vc4hb\" (UID: \"6176777d-0028-4420-a29c-cbf0b361c378\") " pod="openstack/horizon-6d99fcb759-vc4hb" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.256897 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f91334a-239f-4459-b885-aa9865bc6a04-run-httpd\") pod \"ceilometer-0\" (UID: 
\"1f91334a-239f-4459-b885-aa9865bc6a04\") " pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.259637 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.260446 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6176777d-0028-4420-a29c-cbf0b361c378-horizon-secret-key\") pod \"horizon-6d99fcb759-vc4hb\" (UID: \"6176777d-0028-4420-a29c-cbf0b361c378\") " pod="openstack/horizon-6d99fcb759-vc4hb" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.265412 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-scripts\") pod \"ceilometer-0\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.269649 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x29x2\" (UniqueName: \"kubernetes.io/projected/1f91334a-239f-4459-b885-aa9865bc6a04-kube-api-access-x29x2\") pod \"ceilometer-0\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.274987 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.275655 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-config-data\") pod \"ceilometer-0\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.283926 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l582g\" (UniqueName: \"kubernetes.io/projected/6176777d-0028-4420-a29c-cbf0b361c378-kube-api-access-l582g\") pod \"horizon-6d99fcb759-vc4hb\" (UID: \"6176777d-0028-4420-a29c-cbf0b361c378\") " pod="openstack/horizon-6d99fcb759-vc4hb" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.351363 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9a2cdb4-f238-48ed-a0be-d3f895c72868-logs\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.351793 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.351841 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbpq6\" (UniqueName: \"kubernetes.io/projected/d9a2cdb4-f238-48ed-a0be-d3f895c72868-kube-api-access-mbpq6\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.351934 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.351955 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.351980 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.352071 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9a2cdb4-f238-48ed-a0be-d3f895c72868-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.352107 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.356201 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.356547 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9a2cdb4-f238-48ed-a0be-d3f895c72868-logs\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.357062 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9a2cdb4-f238-48ed-a0be-d3f895c72868-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.360507 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.352111 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.376359 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " 
pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.379051 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.398135 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.407471 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbpq6\" (UniqueName: \"kubernetes.io/projected/d9a2cdb4-f238-48ed-a0be-d3f895c72868-kube-api-access-mbpq6\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.410941 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-dzzbk" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.416584 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vjfrh"] Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.427342 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: W1009 10:44:35.429718 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2207949f_1d38_47c3_9573_24dfa0a2db9e.slice/crio-2f001b33a425889125ee7a3399b4b0014d1b80e919c25180f57423e19fc321d9 WatchSource:0}: Error finding container 2f001b33a425889125ee7a3399b4b0014d1b80e919c25180f57423e19fc321d9: Status 404 returned error can't find the container with id 2f001b33a425889125ee7a3399b4b0014d1b80e919c25180f57423e19fc321d9 Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.476946 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j"] Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.484369 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d99fcb759-vc4hb" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.492912 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.507474 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.718351 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-dc53-account-create-56z9f" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.870962 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7nqj\" (UniqueName: \"kubernetes.io/projected/dbcc00a8-003c-48f3-b7e5-5bade54830fe-kube-api-access-v7nqj\") pod \"dbcc00a8-003c-48f3-b7e5-5bade54830fe\" (UID: \"dbcc00a8-003c-48f3-b7e5-5bade54830fe\") " Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.894716 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbcc00a8-003c-48f3-b7e5-5bade54830fe-kube-api-access-v7nqj" (OuterVolumeSpecName: "kube-api-access-v7nqj") pod "dbcc00a8-003c-48f3-b7e5-5bade54830fe" (UID: "dbcc00a8-003c-48f3-b7e5-5bade54830fe"). InnerVolumeSpecName "kube-api-access-v7nqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.912876 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-tr9f9"] Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.924892 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5584f4df97-wsq5t"] Oct 09 10:44:35 crc kubenswrapper[4740]: I1009 10:44:35.977662 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7nqj\" (UniqueName: \"kubernetes.io/projected/dbcc00a8-003c-48f3-b7e5-5bade54830fe-kube-api-access-v7nqj\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:36 crc kubenswrapper[4740]: I1009 10:44:36.185287 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4f86-account-create-b7lmp" Oct 09 10:44:36 crc kubenswrapper[4740]: I1009 10:44:36.187556 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-dc53-account-create-56z9f" event={"ID":"dbcc00a8-003c-48f3-b7e5-5bade54830fe","Type":"ContainerDied","Data":"986d557d0b6ffdc7e43d5c7f40ab650fba45fa5d1976af044ce41693b3b90fab"} Oct 09 10:44:36 crc kubenswrapper[4740]: I1009 10:44:36.187602 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="986d557d0b6ffdc7e43d5c7f40ab650fba45fa5d1976af044ce41693b3b90fab" Oct 09 10:44:36 crc kubenswrapper[4740]: I1009 10:44:36.187665 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-dc53-account-create-56z9f" Oct 09 10:44:36 crc kubenswrapper[4740]: I1009 10:44:36.194752 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" event={"ID":"9795efc1-c793-4c31-b34c-c86cb6ea3bca","Type":"ContainerStarted","Data":"dc6b3c9417e2b7c90b7d266ce7ec5e3d0998efce3bc091f82982aedc92084be5"} Oct 09 10:44:36 crc kubenswrapper[4740]: I1009 10:44:36.207513 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vjfrh" event={"ID":"2207949f-1d38-47c3-9573-24dfa0a2db9e","Type":"ContainerStarted","Data":"2f001b33a425889125ee7a3399b4b0014d1b80e919c25180f57423e19fc321d9"} Oct 09 10:44:36 crc kubenswrapper[4740]: I1009 10:44:36.210867 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4f86-account-create-b7lmp" event={"ID":"da198044-a030-4982-bdb9-9a232c4a1191","Type":"ContainerDied","Data":"c2a3538e66ea27e009c64f537bb8092b357074cda5e543e424e3d7af31173508"} Oct 09 10:44:36 crc kubenswrapper[4740]: I1009 10:44:36.210920 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2a3538e66ea27e009c64f537bb8092b357074cda5e543e424e3d7af31173508" Oct 09 10:44:36 crc 
kubenswrapper[4740]: I1009 10:44:36.210882 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4f86-account-create-b7lmp"
Oct 09 10:44:36 crc kubenswrapper[4740]: I1009 10:44:36.214378 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" event={"ID":"3e84b517-ac89-462b-baae-559a25766f7e","Type":"ContainerStarted","Data":"a0d632418ccc1d6b8dc1715c5034d92fac99aa4534c147c4a226f4938c9b0f58"}
Oct 09 10:44:36 crc kubenswrapper[4740]: I1009 10:44:36.219732 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg" podUID="08b9d2f6-45a1-4325-97a9-770a2f727ab4" containerName="dnsmasq-dns" containerID="cri-o://5dc8f80fb1a4b163e0cf256a0fa59999db07781e836d61de0481bad7dcdcf9b0" gracePeriod=10
Oct 09 10:44:36 crc kubenswrapper[4740]: I1009 10:44:36.219786 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5584f4df97-wsq5t" event={"ID":"71cf9683-bcf4-4367-8365-08ef2fbe73d5","Type":"ContainerStarted","Data":"97a2a36b2293e28f4df238d885417a2bc40ede7c64b67d5200509b15d887edac"}
Oct 09 10:44:36 crc kubenswrapper[4740]: W1009 10:44:36.262496 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod996070ad_888f_48c2_a368_de97d22b13c1.slice/crio-c83d181f7fcef2410fd8bb4a271ad04b25b5ba599612cb67f016c3c048e329cc WatchSource:0}: Error finding container c83d181f7fcef2410fd8bb4a271ad04b25b5ba599612cb67f016c3c048e329cc: Status 404 returned error can't find the container with id c83d181f7fcef2410fd8bb4a271ad04b25b5ba599612cb67f016c3c048e329cc
Oct 09 10:44:36 crc kubenswrapper[4740]: I1009 10:44:36.266948 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.292076 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n8t7\" (UniqueName: \"kubernetes.io/projected/da198044-a030-4982-bdb9-9a232c4a1191-kube-api-access-7n8t7\") pod \"da198044-a030-4982-bdb9-9a232c4a1191\" (UID: \"da198044-a030-4982-bdb9-9a232c4a1191\") "
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.297488 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da198044-a030-4982-bdb9-9a232c4a1191-kube-api-access-7n8t7" (OuterVolumeSpecName: "kube-api-access-7n8t7") pod "da198044-a030-4982-bdb9-9a232c4a1191" (UID: "da198044-a030-4982-bdb9-9a232c4a1191"). InnerVolumeSpecName "kube-api-access-7n8t7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.340032 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-dzzbk"]
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.352047 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.397839 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n8t7\" (UniqueName: \"kubernetes.io/projected/da198044-a030-4982-bdb9-9a232c4a1191-kube-api-access-7n8t7\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:37 crc kubenswrapper[4740]: W1009 10:44:36.401635 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe99ba98_fb4b_4609_986e_3636a4a8f244.slice/crio-351af6b7bfbd24c41f6ec34d48be2c286a2e9c30690dfdf9e9c334aeff63b093 WatchSource:0}: Error finding container 351af6b7bfbd24c41f6ec34d48be2c286a2e9c30690dfdf9e9c334aeff63b093: Status 404 returned error can't find the container with id 351af6b7bfbd24c41f6ec34d48be2c286a2e9c30690dfdf9e9c334aeff63b093
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.459170 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d99fcb759-vc4hb"]
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.522645 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.622434 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.655673 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5584f4df97-wsq5t"]
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.694917 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.707563 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-759d8b8899-gj54k"]
Oct 09 10:44:37 crc kubenswrapper[4740]: E1009 10:44:36.708084 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da198044-a030-4982-bdb9-9a232c4a1191" containerName="mariadb-account-create"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.708104 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="da198044-a030-4982-bdb9-9a232c4a1191" containerName="mariadb-account-create"
Oct 09 10:44:37 crc kubenswrapper[4740]: E1009 10:44:36.708159 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbcc00a8-003c-48f3-b7e5-5bade54830fe" containerName="mariadb-account-create"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.708169 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbcc00a8-003c-48f3-b7e5-5bade54830fe" containerName="mariadb-account-create"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.708378 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="da198044-a030-4982-bdb9-9a232c4a1191" containerName="mariadb-account-create"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.708402 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbcc00a8-003c-48f3-b7e5-5bade54830fe" containerName="mariadb-account-create"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.709655 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-759d8b8899-gj54k"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.731857 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-759d8b8899-gj54k"]
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.819119 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52cfx\" (UniqueName: \"kubernetes.io/projected/91408384-50b1-4bf9-9b73-3e82a64d73d2-kube-api-access-52cfx\") pod \"horizon-759d8b8899-gj54k\" (UID: \"91408384-50b1-4bf9-9b73-3e82a64d73d2\") " pod="openstack/horizon-759d8b8899-gj54k"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.819177 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/91408384-50b1-4bf9-9b73-3e82a64d73d2-horizon-secret-key\") pod \"horizon-759d8b8899-gj54k\" (UID: \"91408384-50b1-4bf9-9b73-3e82a64d73d2\") " pod="openstack/horizon-759d8b8899-gj54k"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.819224 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91408384-50b1-4bf9-9b73-3e82a64d73d2-logs\") pod \"horizon-759d8b8899-gj54k\" (UID: \"91408384-50b1-4bf9-9b73-3e82a64d73d2\") " pod="openstack/horizon-759d8b8899-gj54k"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.819338 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91408384-50b1-4bf9-9b73-3e82a64d73d2-scripts\") pod \"horizon-759d8b8899-gj54k\" (UID: \"91408384-50b1-4bf9-9b73-3e82a64d73d2\") " pod="openstack/horizon-759d8b8899-gj54k"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.819363 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91408384-50b1-4bf9-9b73-3e82a64d73d2-config-data\") pod \"horizon-759d8b8899-gj54k\" (UID: \"91408384-50b1-4bf9-9b73-3e82a64d73d2\") " pod="openstack/horizon-759d8b8899-gj54k"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.925195 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52cfx\" (UniqueName: \"kubernetes.io/projected/91408384-50b1-4bf9-9b73-3e82a64d73d2-kube-api-access-52cfx\") pod \"horizon-759d8b8899-gj54k\" (UID: \"91408384-50b1-4bf9-9b73-3e82a64d73d2\") " pod="openstack/horizon-759d8b8899-gj54k"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.925251 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/91408384-50b1-4bf9-9b73-3e82a64d73d2-horizon-secret-key\") pod \"horizon-759d8b8899-gj54k\" (UID: \"91408384-50b1-4bf9-9b73-3e82a64d73d2\") " pod="openstack/horizon-759d8b8899-gj54k"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.925291 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91408384-50b1-4bf9-9b73-3e82a64d73d2-logs\") pod \"horizon-759d8b8899-gj54k\" (UID: \"91408384-50b1-4bf9-9b73-3e82a64d73d2\") " pod="openstack/horizon-759d8b8899-gj54k"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.925339 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91408384-50b1-4bf9-9b73-3e82a64d73d2-config-data\") pod \"horizon-759d8b8899-gj54k\" (UID: \"91408384-50b1-4bf9-9b73-3e82a64d73d2\") " pod="openstack/horizon-759d8b8899-gj54k"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.925354 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91408384-50b1-4bf9-9b73-3e82a64d73d2-scripts\") pod \"horizon-759d8b8899-gj54k\" (UID: \"91408384-50b1-4bf9-9b73-3e82a64d73d2\") " pod="openstack/horizon-759d8b8899-gj54k"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.925879 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91408384-50b1-4bf9-9b73-3e82a64d73d2-logs\") pod \"horizon-759d8b8899-gj54k\" (UID: \"91408384-50b1-4bf9-9b73-3e82a64d73d2\") " pod="openstack/horizon-759d8b8899-gj54k"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.926223 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91408384-50b1-4bf9-9b73-3e82a64d73d2-scripts\") pod \"horizon-759d8b8899-gj54k\" (UID: \"91408384-50b1-4bf9-9b73-3e82a64d73d2\") " pod="openstack/horizon-759d8b8899-gj54k"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.927139 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91408384-50b1-4bf9-9b73-3e82a64d73d2-config-data\") pod \"horizon-759d8b8899-gj54k\" (UID: \"91408384-50b1-4bf9-9b73-3e82a64d73d2\") " pod="openstack/horizon-759d8b8899-gj54k"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.941939 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/91408384-50b1-4bf9-9b73-3e82a64d73d2-horizon-secret-key\") pod \"horizon-759d8b8899-gj54k\" (UID: \"91408384-50b1-4bf9-9b73-3e82a64d73d2\") " pod="openstack/horizon-759d8b8899-gj54k"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:36.942441 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52cfx\" (UniqueName: \"kubernetes.io/projected/91408384-50b1-4bf9-9b73-3e82a64d73d2-kube-api-access-52cfx\") pod \"horizon-759d8b8899-gj54k\" (UID: \"91408384-50b1-4bf9-9b73-3e82a64d73d2\") " pod="openstack/horizon-759d8b8899-gj54k"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.037097 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-759d8b8899-gj54k"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.240849 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f91334a-239f-4459-b885-aa9865bc6a04","Type":"ContainerStarted","Data":"f7181f6679bffc156b68aeaa09bd04438232222c522cd5250bb4a08904c37d01"}
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.242251 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d9a2cdb4-f238-48ed-a0be-d3f895c72868","Type":"ContainerStarted","Data":"6ec0487084ba176f73a61d7af46f68742d6e6a78276d432db6f1e5b46fa1bedf"}
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.244191 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vjfrh" event={"ID":"2207949f-1d38-47c3-9573-24dfa0a2db9e","Type":"ContainerStarted","Data":"7178066fd7948f6c4b1b3fd996cad0ff8346b3903f2052712eeecd883e501538"}
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.247467 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"996070ad-888f-48c2-a368-de97d22b13c1","Type":"ContainerStarted","Data":"415d4466d534f91f0399efa0e35241f9c1630c6f916f374e910d9280cb5c8fd3"}
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.247688 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"996070ad-888f-48c2-a368-de97d22b13c1","Type":"ContainerStarted","Data":"c83d181f7fcef2410fd8bb4a271ad04b25b5ba599612cb67f016c3c048e329cc"}
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.252647 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dzzbk" event={"ID":"be99ba98-fb4b-4609-986e-3636a4a8f244","Type":"ContainerStarted","Data":"351af6b7bfbd24c41f6ec34d48be2c286a2e9c30690dfdf9e9c334aeff63b093"}
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.253464 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d99fcb759-vc4hb" event={"ID":"6176777d-0028-4420-a29c-cbf0b361c378","Type":"ContainerStarted","Data":"9e158db5faa03782690e2afd15acd764430bb6ea776aa3780e58b01c7ab62ee6"}
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.254738 4740 generic.go:334] "Generic (PLEG): container finished" podID="9795efc1-c793-4c31-b34c-c86cb6ea3bca" containerID="2cf321b442691b6adfbbd0d2f63692a546dd2144ada6f8fc1961ff8dd78fd5ec" exitCode=0
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.254862 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" event={"ID":"9795efc1-c793-4c31-b34c-c86cb6ea3bca","Type":"ContainerDied","Data":"2cf321b442691b6adfbbd0d2f63692a546dd2144ada6f8fc1961ff8dd78fd5ec"}
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.265477 4740 generic.go:334] "Generic (PLEG): container finished" podID="3e84b517-ac89-462b-baae-559a25766f7e" containerID="37751f85cf80f7d38c7d643d59cc8ee715c78a318f2c9b516e7e2c741d6b58b4" exitCode=0
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.265590 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" event={"ID":"3e84b517-ac89-462b-baae-559a25766f7e","Type":"ContainerDied","Data":"37751f85cf80f7d38c7d643d59cc8ee715c78a318f2c9b516e7e2c741d6b58b4"}
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.266407 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vjfrh" podStartSLOduration=3.266394706 podStartE2EDuration="3.266394706s" podCreationTimestamp="2025-10-09 10:44:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:44:37.25746611 +0000 UTC m=+1016.219666491" watchObservedRunningTime="2025-10-09 10:44:37.266394706 +0000 UTC m=+1016.228595087"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.268709 4740 generic.go:334] "Generic (PLEG): container finished" podID="08b9d2f6-45a1-4325-97a9-770a2f727ab4" containerID="5dc8f80fb1a4b163e0cf256a0fa59999db07781e836d61de0481bad7dcdcf9b0" exitCode=0
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.269076 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg" event={"ID":"08b9d2f6-45a1-4325-97a9-770a2f727ab4","Type":"ContainerDied","Data":"5dc8f80fb1a4b163e0cf256a0fa59999db07781e836d61de0481bad7dcdcf9b0"}
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.450892 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.527443 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.635340 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-dns-svc\") pod \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") "
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.635709 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-ovsdbserver-nb\") pod \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") "
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.635779 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwbqp\" (UniqueName: \"kubernetes.io/projected/08b9d2f6-45a1-4325-97a9-770a2f727ab4-kube-api-access-xwbqp\") pod \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") "
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.635834 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-dns-swift-storage-0\") pod \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") "
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.635884 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-config\") pod \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") "
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.635949 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-ovsdbserver-sb\") pod \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\" (UID: \"08b9d2f6-45a1-4325-97a9-770a2f727ab4\") "
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.640073 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-759d8b8899-gj54k"]
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.645219 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08b9d2f6-45a1-4325-97a9-770a2f727ab4-kube-api-access-xwbqp" (OuterVolumeSpecName: "kube-api-access-xwbqp") pod "08b9d2f6-45a1-4325-97a9-770a2f727ab4" (UID: "08b9d2f6-45a1-4325-97a9-770a2f727ab4"). InnerVolumeSpecName "kube-api-access-xwbqp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.713134 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.738256 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwbqp\" (UniqueName: \"kubernetes.io/projected/08b9d2f6-45a1-4325-97a9-770a2f727ab4-kube-api-access-xwbqp\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.747400 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "08b9d2f6-45a1-4325-97a9-770a2f727ab4" (UID: "08b9d2f6-45a1-4325-97a9-770a2f727ab4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.763794 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "08b9d2f6-45a1-4325-97a9-770a2f727ab4" (UID: "08b9d2f6-45a1-4325-97a9-770a2f727ab4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.763902 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "08b9d2f6-45a1-4325-97a9-770a2f727ab4" (UID: "08b9d2f6-45a1-4325-97a9-770a2f727ab4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.792484 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-config" (OuterVolumeSpecName: "config") pod "08b9d2f6-45a1-4325-97a9-770a2f727ab4" (UID: "08b9d2f6-45a1-4325-97a9-770a2f727ab4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.799076 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "08b9d2f6-45a1-4325-97a9-770a2f727ab4" (UID: "08b9d2f6-45a1-4325-97a9-770a2f727ab4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.839359 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-ovsdbserver-nb\") pod \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") "
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.839405 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n92m\" (UniqueName: \"kubernetes.io/projected/9795efc1-c793-4c31-b34c-c86cb6ea3bca-kube-api-access-4n92m\") pod \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") "
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.839454 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-dns-svc\") pod \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") "
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.839490 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-config\") pod \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") "
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.839530 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-dns-swift-storage-0\") pod \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") "
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.839651 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-ovsdbserver-sb\") pod \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\" (UID: \"9795efc1-c793-4c31-b34c-c86cb6ea3bca\") "
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.840729 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-config\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.840747 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.840773 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.840783 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.840792 4740 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08b9d2f6-45a1-4325-97a9-770a2f727ab4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.844308 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9795efc1-c793-4c31-b34c-c86cb6ea3bca-kube-api-access-4n92m" (OuterVolumeSpecName: "kube-api-access-4n92m") pod "9795efc1-c793-4c31-b34c-c86cb6ea3bca" (UID: "9795efc1-c793-4c31-b34c-c86cb6ea3bca"). InnerVolumeSpecName "kube-api-access-4n92m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.871748 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9795efc1-c793-4c31-b34c-c86cb6ea3bca" (UID: "9795efc1-c793-4c31-b34c-c86cb6ea3bca"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.874791 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9795efc1-c793-4c31-b34c-c86cb6ea3bca" (UID: "9795efc1-c793-4c31-b34c-c86cb6ea3bca"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.875595 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9795efc1-c793-4c31-b34c-c86cb6ea3bca" (UID: "9795efc1-c793-4c31-b34c-c86cb6ea3bca"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.878310 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-config" (OuterVolumeSpecName: "config") pod "9795efc1-c793-4c31-b34c-c86cb6ea3bca" (UID: "9795efc1-c793-4c31-b34c-c86cb6ea3bca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.926405 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9795efc1-c793-4c31-b34c-c86cb6ea3bca" (UID: "9795efc1-c793-4c31-b34c-c86cb6ea3bca"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.942069 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.942110 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n92m\" (UniqueName: \"kubernetes.io/projected/9795efc1-c793-4c31-b34c-c86cb6ea3bca-kube-api-access-4n92m\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.942125 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.942136 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-config\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.942146 4740 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.942157 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9795efc1-c793-4c31-b34c-c86cb6ea3bca-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.976913 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-mw6z4"]
Oct 09 10:44:37 crc kubenswrapper[4740]: E1009 10:44:37.988505 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08b9d2f6-45a1-4325-97a9-770a2f727ab4" containerName="init"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.988546 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="08b9d2f6-45a1-4325-97a9-770a2f727ab4" containerName="init"
Oct 09 10:44:37 crc kubenswrapper[4740]: E1009 10:44:37.988561 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08b9d2f6-45a1-4325-97a9-770a2f727ab4" containerName="dnsmasq-dns"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.988569 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="08b9d2f6-45a1-4325-97a9-770a2f727ab4" containerName="dnsmasq-dns"
Oct 09 10:44:37 crc kubenswrapper[4740]: E1009 10:44:37.988598 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9795efc1-c793-4c31-b34c-c86cb6ea3bca" containerName="init"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.988606 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="9795efc1-c793-4c31-b34c-c86cb6ea3bca" containerName="init"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.988935 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="9795efc1-c793-4c31-b34c-c86cb6ea3bca" containerName="init"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.988967 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="08b9d2f6-45a1-4325-97a9-770a2f727ab4" containerName="dnsmasq-dns"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.989529 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-ktvhb"]
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.990458 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ktvhb"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.991852 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mw6z4"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.995619 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.996002 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-89nf8"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.996234 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.996387 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.996596 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-69htd"
Oct 09 10:44:37 crc kubenswrapper[4740]: I1009 10:44:37.997259 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ktvhb"]
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.006379 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mw6z4"]
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.145833 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-db-sync-config-data\") pod \"cinder-db-sync-mw6z4\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " pod="openstack/cinder-db-sync-mw6z4"
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.146178 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl4mj\" (UniqueName: \"kubernetes.io/projected/71a8fb50-724c-4b07-83e2-71d8ee90cb05-kube-api-access-sl4mj\") pod \"barbican-db-sync-ktvhb\" (UID: \"71a8fb50-724c-4b07-83e2-71d8ee90cb05\") " pod="openstack/barbican-db-sync-ktvhb"
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.146251 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a8fb50-724c-4b07-83e2-71d8ee90cb05-combined-ca-bundle\") pod \"barbican-db-sync-ktvhb\" (UID: \"71a8fb50-724c-4b07-83e2-71d8ee90cb05\") " pod="openstack/barbican-db-sync-ktvhb"
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.146308 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-config-data\") pod \"cinder-db-sync-mw6z4\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " pod="openstack/cinder-db-sync-mw6z4"
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.146338 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3062e734-0f07-4e8f-862e-a2906e7bbbd5-etc-machine-id\") pod \"cinder-db-sync-mw6z4\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " pod="openstack/cinder-db-sync-mw6z4"
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.146377 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-scripts\") pod \"cinder-db-sync-mw6z4\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " pod="openstack/cinder-db-sync-mw6z4"
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.146397 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5mb7\" (UniqueName: \"kubernetes.io/projected/3062e734-0f07-4e8f-862e-a2906e7bbbd5-kube-api-access-h5mb7\") pod \"cinder-db-sync-mw6z4\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " pod="openstack/cinder-db-sync-mw6z4"
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.146426 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-combined-ca-bundle\") pod \"cinder-db-sync-mw6z4\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " pod="openstack/cinder-db-sync-mw6z4"
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.146445 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/71a8fb50-724c-4b07-83e2-71d8ee90cb05-db-sync-config-data\") pod \"barbican-db-sync-ktvhb\" (UID: \"71a8fb50-724c-4b07-83e2-71d8ee90cb05\") " pod="openstack/barbican-db-sync-ktvhb"
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.247338 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a8fb50-724c-4b07-83e2-71d8ee90cb05-combined-ca-bundle\") pod \"barbican-db-sync-ktvhb\" (UID: \"71a8fb50-724c-4b07-83e2-71d8ee90cb05\") " pod="openstack/barbican-db-sync-ktvhb"
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.247399 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-config-data\") pod \"cinder-db-sync-mw6z4\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " pod="openstack/cinder-db-sync-mw6z4"
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.247443 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3062e734-0f07-4e8f-862e-a2906e7bbbd5-etc-machine-id\") pod \"cinder-db-sync-mw6z4\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " pod="openstack/cinder-db-sync-mw6z4"
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.247461 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-scripts\") pod \"cinder-db-sync-mw6z4\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " pod="openstack/cinder-db-sync-mw6z4"
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.247477 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5mb7\" (UniqueName: \"kubernetes.io/projected/3062e734-0f07-4e8f-862e-a2906e7bbbd5-kube-api-access-h5mb7\") pod \"cinder-db-sync-mw6z4\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " pod="openstack/cinder-db-sync-mw6z4"
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.247495 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-combined-ca-bundle\") pod \"cinder-db-sync-mw6z4\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " pod="openstack/cinder-db-sync-mw6z4"
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.247514 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/71a8fb50-724c-4b07-83e2-71d8ee90cb05-db-sync-config-data\") pod \"barbican-db-sync-ktvhb\" (UID: \"71a8fb50-724c-4b07-83e2-71d8ee90cb05\") " pod="openstack/barbican-db-sync-ktvhb"
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.247534 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-db-sync-config-data\") pod \"cinder-db-sync-mw6z4\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " pod="openstack/cinder-db-sync-mw6z4"
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.247567 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl4mj\" (UniqueName: \"kubernetes.io/projected/71a8fb50-724c-4b07-83e2-71d8ee90cb05-kube-api-access-sl4mj\") pod \"barbican-db-sync-ktvhb\" (UID: \"71a8fb50-724c-4b07-83e2-71d8ee90cb05\") " pod="openstack/barbican-db-sync-ktvhb"
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.249062 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3062e734-0f07-4e8f-862e-a2906e7bbbd5-etc-machine-id\") pod \"cinder-db-sync-mw6z4\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " pod="openstack/cinder-db-sync-mw6z4"
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.253351 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/71a8fb50-724c-4b07-83e2-71d8ee90cb05-db-sync-config-data\") pod \"barbican-db-sync-ktvhb\" (UID: \"71a8fb50-724c-4b07-83e2-71d8ee90cb05\") " pod="openstack/barbican-db-sync-ktvhb"
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.253808 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-scripts\") pod \"cinder-db-sync-mw6z4\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " pod="openstack/cinder-db-sync-mw6z4"
Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.257586 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-combined-ca-bundle\") pod \"cinder-db-sync-mw6z4\" (UID:
\"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " pod="openstack/cinder-db-sync-mw6z4" Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.257933 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-db-sync-config-data\") pod \"cinder-db-sync-mw6z4\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " pod="openstack/cinder-db-sync-mw6z4" Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.258716 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-config-data\") pod \"cinder-db-sync-mw6z4\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " pod="openstack/cinder-db-sync-mw6z4" Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.262699 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a8fb50-724c-4b07-83e2-71d8ee90cb05-combined-ca-bundle\") pod \"barbican-db-sync-ktvhb\" (UID: \"71a8fb50-724c-4b07-83e2-71d8ee90cb05\") " pod="openstack/barbican-db-sync-ktvhb" Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.279472 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5mb7\" (UniqueName: \"kubernetes.io/projected/3062e734-0f07-4e8f-862e-a2906e7bbbd5-kube-api-access-h5mb7\") pod \"cinder-db-sync-mw6z4\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " pod="openstack/cinder-db-sync-mw6z4" Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.283669 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl4mj\" (UniqueName: \"kubernetes.io/projected/71a8fb50-724c-4b07-83e2-71d8ee90cb05-kube-api-access-sl4mj\") pod \"barbican-db-sync-ktvhb\" (UID: \"71a8fb50-724c-4b07-83e2-71d8ee90cb05\") " pod="openstack/barbican-db-sync-ktvhb" Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 
10:44:38.297238 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" event={"ID":"3e84b517-ac89-462b-baae-559a25766f7e","Type":"ContainerStarted","Data":"6fa0c77057ccff8c499f02dc8114209b7aa046ee19a37dfaf9c5852a6a693f4a"} Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.298579 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-759d8b8899-gj54k" event={"ID":"91408384-50b1-4bf9-9b73-3e82a64d73d2","Type":"ContainerStarted","Data":"7b8a1fea369826818f62652946db58cf12934c3ada704ae21c0526e57adf5770"} Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.300366 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" event={"ID":"9795efc1-c793-4c31-b34c-c86cb6ea3bca","Type":"ContainerDied","Data":"dc6b3c9417e2b7c90b7d266ce7ec5e3d0998efce3bc091f82982aedc92084be5"} Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.300408 4740 scope.go:117] "RemoveContainer" containerID="2cf321b442691b6adfbbd0d2f63692a546dd2144ada6f8fc1961ff8dd78fd5ec" Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.300575 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j" Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.326194 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg" event={"ID":"08b9d2f6-45a1-4325-97a9-770a2f727ab4","Type":"ContainerDied","Data":"ccf3833ae5953b4c792c73ded601adbbe44c59146a71b07a43879dbea3f6a9f3"} Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.326340 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-nrzhg" Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.356084 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d9a2cdb4-f238-48ed-a0be-d3f895c72868","Type":"ContainerStarted","Data":"e17fae89ddc741aa737630e335cb189eb39baa06bacba265f8841f1c307cb442"} Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.356655 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ktvhb" Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.368283 4740 scope.go:117] "RemoveContainer" containerID="5dc8f80fb1a4b163e0cf256a0fa59999db07781e836d61de0481bad7dcdcf9b0" Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.373344 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mw6z4" Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.411090 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j"] Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.420980 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-lzr2j"] Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.428506 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-nrzhg"] Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.431190 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-nrzhg"] Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.443431 4740 scope.go:117] "RemoveContainer" containerID="3617c9d6219416895d96401839488d8cc4b33d7d1cbcf33b25dc2ae7ea7e0ede" Oct 09 10:44:38 crc kubenswrapper[4740]: I1009 10:44:38.925002 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ktvhb"] Oct 09 10:44:38 crc kubenswrapper[4740]: W1009 10:44:38.939935 4740 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71a8fb50_724c_4b07_83e2_71d8ee90cb05.slice/crio-0ace26903fb13a6d02f7c4d59441e3ad1ea0b954440ed07f16b30c0fa59f899f WatchSource:0}: Error finding container 0ace26903fb13a6d02f7c4d59441e3ad1ea0b954440ed07f16b30c0fa59f899f: Status 404 returned error can't find the container with id 0ace26903fb13a6d02f7c4d59441e3ad1ea0b954440ed07f16b30c0fa59f899f Oct 09 10:44:39 crc kubenswrapper[4740]: I1009 10:44:39.018181 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mw6z4"] Oct 09 10:44:39 crc kubenswrapper[4740]: W1009 10:44:39.025179 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3062e734_0f07_4e8f_862e_a2906e7bbbd5.slice/crio-6ad8139eca0f15fd8e30c7156fa7c4a3023f3f5ea90440d6cbdb5588237fada0 WatchSource:0}: Error finding container 6ad8139eca0f15fd8e30c7156fa7c4a3023f3f5ea90440d6cbdb5588237fada0: Status 404 returned error can't find the container with id 6ad8139eca0f15fd8e30c7156fa7c4a3023f3f5ea90440d6cbdb5588237fada0 Oct 09 10:44:39 crc kubenswrapper[4740]: I1009 10:44:39.369942 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d9a2cdb4-f238-48ed-a0be-d3f895c72868","Type":"ContainerStarted","Data":"da0aa88afd1cb03c1ec2765072b38138de8e4e7997f33a81f9bafc5b025f0b70"} Oct 09 10:44:39 crc kubenswrapper[4740]: I1009 10:44:39.370100 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d9a2cdb4-f238-48ed-a0be-d3f895c72868" containerName="glance-log" containerID="cri-o://e17fae89ddc741aa737630e335cb189eb39baa06bacba265f8841f1c307cb442" gracePeriod=30 Oct 09 10:44:39 crc kubenswrapper[4740]: I1009 10:44:39.370433 4740 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="d9a2cdb4-f238-48ed-a0be-d3f895c72868" containerName="glance-httpd" containerID="cri-o://da0aa88afd1cb03c1ec2765072b38138de8e4e7997f33a81f9bafc5b025f0b70" gracePeriod=30 Oct 09 10:44:39 crc kubenswrapper[4740]: I1009 10:44:39.379058 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ktvhb" event={"ID":"71a8fb50-724c-4b07-83e2-71d8ee90cb05","Type":"ContainerStarted","Data":"0ace26903fb13a6d02f7c4d59441e3ad1ea0b954440ed07f16b30c0fa59f899f"} Oct 09 10:44:39 crc kubenswrapper[4740]: I1009 10:44:39.380844 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mw6z4" event={"ID":"3062e734-0f07-4e8f-862e-a2906e7bbbd5","Type":"ContainerStarted","Data":"6ad8139eca0f15fd8e30c7156fa7c4a3023f3f5ea90440d6cbdb5588237fada0"} Oct 09 10:44:39 crc kubenswrapper[4740]: I1009 10:44:39.384945 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="996070ad-888f-48c2-a368-de97d22b13c1" containerName="glance-log" containerID="cri-o://415d4466d534f91f0399efa0e35241f9c1630c6f916f374e910d9280cb5c8fd3" gracePeriod=30 Oct 09 10:44:39 crc kubenswrapper[4740]: I1009 10:44:39.385217 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"996070ad-888f-48c2-a368-de97d22b13c1","Type":"ContainerStarted","Data":"a227a6af01e367af25d7c0b48d06e3e5c83af5055533d6ee2672d2efc92792d9"} Oct 09 10:44:39 crc kubenswrapper[4740]: I1009 10:44:39.385250 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:39 crc kubenswrapper[4740]: I1009 10:44:39.385293 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="996070ad-888f-48c2-a368-de97d22b13c1" containerName="glance-httpd" 
containerID="cri-o://a227a6af01e367af25d7c0b48d06e3e5c83af5055533d6ee2672d2efc92792d9" gracePeriod=30 Oct 09 10:44:39 crc kubenswrapper[4740]: I1009 10:44:39.396293 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.396279174 podStartE2EDuration="5.396279174s" podCreationTimestamp="2025-10-09 10:44:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:44:39.395361599 +0000 UTC m=+1018.357562000" watchObservedRunningTime="2025-10-09 10:44:39.396279174 +0000 UTC m=+1018.358479555" Oct 09 10:44:39 crc kubenswrapper[4740]: I1009 10:44:39.425204 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.425186689 podStartE2EDuration="5.425186689s" podCreationTimestamp="2025-10-09 10:44:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:44:39.422219667 +0000 UTC m=+1018.384420048" watchObservedRunningTime="2025-10-09 10:44:39.425186689 +0000 UTC m=+1018.387387070" Oct 09 10:44:39 crc kubenswrapper[4740]: I1009 10:44:39.443299 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" podStartSLOduration=5.443284647 podStartE2EDuration="5.443284647s" podCreationTimestamp="2025-10-09 10:44:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:44:39.442129835 +0000 UTC m=+1018.404330216" watchObservedRunningTime="2025-10-09 10:44:39.443284647 +0000 UTC m=+1018.405485028" Oct 09 10:44:39 crc kubenswrapper[4740]: I1009 10:44:39.766664 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08b9d2f6-45a1-4325-97a9-770a2f727ab4" 
path="/var/lib/kubelet/pods/08b9d2f6-45a1-4325-97a9-770a2f727ab4/volumes" Oct 09 10:44:39 crc kubenswrapper[4740]: I1009 10:44:39.767900 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9795efc1-c793-4c31-b34c-c86cb6ea3bca" path="/var/lib/kubelet/pods/9795efc1-c793-4c31-b34c-c86cb6ea3bca/volumes" Oct 09 10:44:40 crc kubenswrapper[4740]: I1009 10:44:40.393061 4740 generic.go:334] "Generic (PLEG): container finished" podID="996070ad-888f-48c2-a368-de97d22b13c1" containerID="a227a6af01e367af25d7c0b48d06e3e5c83af5055533d6ee2672d2efc92792d9" exitCode=0 Oct 09 10:44:40 crc kubenswrapper[4740]: I1009 10:44:40.393093 4740 generic.go:334] "Generic (PLEG): container finished" podID="996070ad-888f-48c2-a368-de97d22b13c1" containerID="415d4466d534f91f0399efa0e35241f9c1630c6f916f374e910d9280cb5c8fd3" exitCode=143 Oct 09 10:44:40 crc kubenswrapper[4740]: I1009 10:44:40.393141 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"996070ad-888f-48c2-a368-de97d22b13c1","Type":"ContainerDied","Data":"a227a6af01e367af25d7c0b48d06e3e5c83af5055533d6ee2672d2efc92792d9"} Oct 09 10:44:40 crc kubenswrapper[4740]: I1009 10:44:40.393182 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"996070ad-888f-48c2-a368-de97d22b13c1","Type":"ContainerDied","Data":"415d4466d534f91f0399efa0e35241f9c1630c6f916f374e910d9280cb5c8fd3"} Oct 09 10:44:40 crc kubenswrapper[4740]: I1009 10:44:40.396104 4740 generic.go:334] "Generic (PLEG): container finished" podID="d9a2cdb4-f238-48ed-a0be-d3f895c72868" containerID="da0aa88afd1cb03c1ec2765072b38138de8e4e7997f33a81f9bafc5b025f0b70" exitCode=0 Oct 09 10:44:40 crc kubenswrapper[4740]: I1009 10:44:40.396121 4740 generic.go:334] "Generic (PLEG): container finished" podID="d9a2cdb4-f238-48ed-a0be-d3f895c72868" containerID="e17fae89ddc741aa737630e335cb189eb39baa06bacba265f8841f1c307cb442" exitCode=143 Oct 09 10:44:40 crc 
kubenswrapper[4740]: I1009 10:44:40.396168 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d9a2cdb4-f238-48ed-a0be-d3f895c72868","Type":"ContainerDied","Data":"da0aa88afd1cb03c1ec2765072b38138de8e4e7997f33a81f9bafc5b025f0b70"} Oct 09 10:44:40 crc kubenswrapper[4740]: I1009 10:44:40.396194 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d9a2cdb4-f238-48ed-a0be-d3f895c72868","Type":"ContainerDied","Data":"e17fae89ddc741aa737630e335cb189eb39baa06bacba265f8841f1c307cb442"} Oct 09 10:44:40 crc kubenswrapper[4740]: I1009 10:44:40.398283 4740 generic.go:334] "Generic (PLEG): container finished" podID="2207949f-1d38-47c3-9573-24dfa0a2db9e" containerID="7178066fd7948f6c4b1b3fd996cad0ff8346b3903f2052712eeecd883e501538" exitCode=0 Oct 09 10:44:40 crc kubenswrapper[4740]: I1009 10:44:40.398379 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vjfrh" event={"ID":"2207949f-1d38-47c3-9573-24dfa0a2db9e","Type":"ContainerDied","Data":"7178066fd7948f6c4b1b3fd996cad0ff8346b3903f2052712eeecd883e501538"} Oct 09 10:44:41 crc kubenswrapper[4740]: I1009 10:44:41.992974 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.002264 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.136856 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-combined-ca-bundle\") pod \"2207949f-1d38-47c3-9573-24dfa0a2db9e\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.136932 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-combined-ca-bundle\") pod \"996070ad-888f-48c2-a368-de97d22b13c1\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.137080 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-config-data\") pod \"996070ad-888f-48c2-a368-de97d22b13c1\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.137105 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-fernet-keys\") pod \"2207949f-1d38-47c3-9573-24dfa0a2db9e\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.137140 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t2xv\" (UniqueName: \"kubernetes.io/projected/996070ad-888f-48c2-a368-de97d22b13c1-kube-api-access-6t2xv\") pod \"996070ad-888f-48c2-a368-de97d22b13c1\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.137231 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-public-tls-certs\") pod \"996070ad-888f-48c2-a368-de97d22b13c1\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.137260 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-config-data\") pod \"2207949f-1d38-47c3-9573-24dfa0a2db9e\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.137330 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"996070ad-888f-48c2-a368-de97d22b13c1\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.137549 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-scripts\") pod \"996070ad-888f-48c2-a368-de97d22b13c1\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.137612 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z69m\" (UniqueName: \"kubernetes.io/projected/2207949f-1d38-47c3-9573-24dfa0a2db9e-kube-api-access-7z69m\") pod \"2207949f-1d38-47c3-9573-24dfa0a2db9e\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.138178 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/996070ad-888f-48c2-a368-de97d22b13c1-logs\") pod \"996070ad-888f-48c2-a368-de97d22b13c1\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.138204 4740 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/996070ad-888f-48c2-a368-de97d22b13c1-httpd-run\") pod \"996070ad-888f-48c2-a368-de97d22b13c1\" (UID: \"996070ad-888f-48c2-a368-de97d22b13c1\") " Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.138228 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-scripts\") pod \"2207949f-1d38-47c3-9573-24dfa0a2db9e\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.138257 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-credential-keys\") pod \"2207949f-1d38-47c3-9573-24dfa0a2db9e\" (UID: \"2207949f-1d38-47c3-9573-24dfa0a2db9e\") " Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.140253 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/996070ad-888f-48c2-a368-de97d22b13c1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "996070ad-888f-48c2-a368-de97d22b13c1" (UID: "996070ad-888f-48c2-a368-de97d22b13c1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.140540 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/996070ad-888f-48c2-a368-de97d22b13c1-logs" (OuterVolumeSpecName: "logs") pod "996070ad-888f-48c2-a368-de97d22b13c1" (UID: "996070ad-888f-48c2-a368-de97d22b13c1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.143817 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/996070ad-888f-48c2-a368-de97d22b13c1-kube-api-access-6t2xv" (OuterVolumeSpecName: "kube-api-access-6t2xv") pod "996070ad-888f-48c2-a368-de97d22b13c1" (UID: "996070ad-888f-48c2-a368-de97d22b13c1"). InnerVolumeSpecName "kube-api-access-6t2xv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.145904 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-scripts" (OuterVolumeSpecName: "scripts") pod "996070ad-888f-48c2-a368-de97d22b13c1" (UID: "996070ad-888f-48c2-a368-de97d22b13c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.148767 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-scripts" (OuterVolumeSpecName: "scripts") pod "2207949f-1d38-47c3-9573-24dfa0a2db9e" (UID: "2207949f-1d38-47c3-9573-24dfa0a2db9e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.149202 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2207949f-1d38-47c3-9573-24dfa0a2db9e-kube-api-access-7z69m" (OuterVolumeSpecName: "kube-api-access-7z69m") pod "2207949f-1d38-47c3-9573-24dfa0a2db9e" (UID: "2207949f-1d38-47c3-9573-24dfa0a2db9e"). InnerVolumeSpecName "kube-api-access-7z69m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.149483 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2207949f-1d38-47c3-9573-24dfa0a2db9e" (UID: "2207949f-1d38-47c3-9573-24dfa0a2db9e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.152781 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "996070ad-888f-48c2-a368-de97d22b13c1" (UID: "996070ad-888f-48c2-a368-de97d22b13c1"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.164444 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2207949f-1d38-47c3-9573-24dfa0a2db9e" (UID: "2207949f-1d38-47c3-9573-24dfa0a2db9e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.168372 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-config-data" (OuterVolumeSpecName: "config-data") pod "2207949f-1d38-47c3-9573-24dfa0a2db9e" (UID: "2207949f-1d38-47c3-9573-24dfa0a2db9e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.181699 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2207949f-1d38-47c3-9573-24dfa0a2db9e" (UID: "2207949f-1d38-47c3-9573-24dfa0a2db9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.190684 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "996070ad-888f-48c2-a368-de97d22b13c1" (UID: "996070ad-888f-48c2-a368-de97d22b13c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.192062 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "996070ad-888f-48c2-a368-de97d22b13c1" (UID: "996070ad-888f-48c2-a368-de97d22b13c1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.193643 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-config-data" (OuterVolumeSpecName: "config-data") pod "996070ad-888f-48c2-a368-de97d22b13c1" (UID: "996070ad-888f-48c2-a368-de97d22b13c1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.241980 4740 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.242011 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t2xv\" (UniqueName: \"kubernetes.io/projected/996070ad-888f-48c2-a368-de97d22b13c1-kube-api-access-6t2xv\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.242025 4740 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.242059 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.242092 4740 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.242101 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.242111 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z69m\" (UniqueName: \"kubernetes.io/projected/2207949f-1d38-47c3-9573-24dfa0a2db9e-kube-api-access-7z69m\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.242121 4740 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/996070ad-888f-48c2-a368-de97d22b13c1-logs\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.242129 4740 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/996070ad-888f-48c2-a368-de97d22b13c1-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.242137 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.242146 4740 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.242155 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2207949f-1d38-47c3-9573-24dfa0a2db9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.242165 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.242175 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/996070ad-888f-48c2-a368-de97d22b13c1-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.261635 4740 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 09 
10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.348244 4740 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.426998 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vjfrh" event={"ID":"2207949f-1d38-47c3-9573-24dfa0a2db9e","Type":"ContainerDied","Data":"2f001b33a425889125ee7a3399b4b0014d1b80e919c25180f57423e19fc321d9"} Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.427044 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f001b33a425889125ee7a3399b4b0014d1b80e919c25180f57423e19fc321d9" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.427114 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vjfrh" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.435638 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"996070ad-888f-48c2-a368-de97d22b13c1","Type":"ContainerDied","Data":"c83d181f7fcef2410fd8bb4a271ad04b25b5ba599612cb67f016c3c048e329cc"} Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.435685 4740 scope.go:117] "RemoveContainer" containerID="a227a6af01e367af25d7c0b48d06e3e5c83af5055533d6ee2672d2efc92792d9" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.435826 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.495322 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vjfrh"] Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.506433 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vjfrh"] Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.522687 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.530659 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.540241 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 10:44:42 crc kubenswrapper[4740]: E1009 10:44:42.540649 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="996070ad-888f-48c2-a368-de97d22b13c1" containerName="glance-httpd" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.540660 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="996070ad-888f-48c2-a368-de97d22b13c1" containerName="glance-httpd" Oct 09 10:44:42 crc kubenswrapper[4740]: E1009 10:44:42.540675 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="996070ad-888f-48c2-a368-de97d22b13c1" containerName="glance-log" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.540680 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="996070ad-888f-48c2-a368-de97d22b13c1" containerName="glance-log" Oct 09 10:44:42 crc kubenswrapper[4740]: E1009 10:44:42.540689 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2207949f-1d38-47c3-9573-24dfa0a2db9e" containerName="keystone-bootstrap" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.540695 4740 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2207949f-1d38-47c3-9573-24dfa0a2db9e" containerName="keystone-bootstrap" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.540894 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="2207949f-1d38-47c3-9573-24dfa0a2db9e" containerName="keystone-bootstrap" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.540914 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="996070ad-888f-48c2-a368-de97d22b13c1" containerName="glance-log" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.540926 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="996070ad-888f-48c2-a368-de97d22b13c1" containerName="glance-httpd" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.543810 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.546123 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.548524 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.548717 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.670248 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-scripts\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.670375 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.670405 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.670613 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5mk9\" (UniqueName: \"kubernetes.io/projected/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-kube-api-access-s5mk9\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.670636 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-logs\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.670667 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.670685 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-config-data\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.670715 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.696880 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-wkc7b"] Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.698938 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wkc7b" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.704461 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.704485 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-52ljb" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.704657 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.704696 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.704930 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wkc7b"] Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.771696 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-scripts\") pod \"keystone-bootstrap-wkc7b\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " pod="openstack/keystone-bootstrap-wkc7b" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.771749 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.771912 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.771960 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.772047 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-config-data\") pod \"keystone-bootstrap-wkc7b\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " pod="openstack/keystone-bootstrap-wkc7b" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.772098 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-fernet-keys\") pod \"keystone-bootstrap-wkc7b\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " pod="openstack/keystone-bootstrap-wkc7b" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.772123 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-credential-keys\") pod \"keystone-bootstrap-wkc7b\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " pod="openstack/keystone-bootstrap-wkc7b" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.772199 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5mk9\" (UniqueName: \"kubernetes.io/projected/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-kube-api-access-s5mk9\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.772304 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-logs\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.772344 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-combined-ca-bundle\") pod \"keystone-bootstrap-wkc7b\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " pod="openstack/keystone-bootstrap-wkc7b" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.772814 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-logs\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.772884 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.773121 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-config-data\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.773898 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.774002 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-scripts\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.774025 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgg5d\" (UniqueName: 
\"kubernetes.io/projected/4808b047-cb78-4910-8c22-65514e99c2cc-kube-api-access-pgg5d\") pod \"keystone-bootstrap-wkc7b\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " pod="openstack/keystone-bootstrap-wkc7b" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.773088 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.798413 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.807055 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-config-data\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.810998 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-scripts\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.821076 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.823234 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5mk9\" (UniqueName: \"kubernetes.io/projected/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-kube-api-access-s5mk9\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.830664 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.880357 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-scripts\") pod \"keystone-bootstrap-wkc7b\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " pod="openstack/keystone-bootstrap-wkc7b" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.880439 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-config-data\") pod \"keystone-bootstrap-wkc7b\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " pod="openstack/keystone-bootstrap-wkc7b" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.880457 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-fernet-keys\") pod \"keystone-bootstrap-wkc7b\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " pod="openstack/keystone-bootstrap-wkc7b" Oct 
09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.880474 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-credential-keys\") pod \"keystone-bootstrap-wkc7b\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " pod="openstack/keystone-bootstrap-wkc7b" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.880505 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-combined-ca-bundle\") pod \"keystone-bootstrap-wkc7b\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " pod="openstack/keystone-bootstrap-wkc7b" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.880592 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgg5d\" (UniqueName: \"kubernetes.io/projected/4808b047-cb78-4910-8c22-65514e99c2cc-kube-api-access-pgg5d\") pod \"keystone-bootstrap-wkc7b\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " pod="openstack/keystone-bootstrap-wkc7b" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.885927 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-scripts\") pod \"keystone-bootstrap-wkc7b\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " pod="openstack/keystone-bootstrap-wkc7b" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.886241 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-fernet-keys\") pod \"keystone-bootstrap-wkc7b\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " pod="openstack/keystone-bootstrap-wkc7b" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.886807 4740 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-c7a9-account-create-6np7n"] Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.887531 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-credential-keys\") pod \"keystone-bootstrap-wkc7b\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " pod="openstack/keystone-bootstrap-wkc7b" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.887636 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-combined-ca-bundle\") pod \"keystone-bootstrap-wkc7b\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " pod="openstack/keystone-bootstrap-wkc7b" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.888070 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c7a9-account-create-6np7n" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.890475 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.891400 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-config-data\") pod \"keystone-bootstrap-wkc7b\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " pod="openstack/keystone-bootstrap-wkc7b" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.898655 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgg5d\" (UniqueName: \"kubernetes.io/projected/4808b047-cb78-4910-8c22-65514e99c2cc-kube-api-access-pgg5d\") pod \"keystone-bootstrap-wkc7b\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " pod="openstack/keystone-bootstrap-wkc7b" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.912133 4740 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/neutron-c7a9-account-create-6np7n"] Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.919908 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 10:44:42 crc kubenswrapper[4740]: I1009 10:44:42.981914 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5rg5\" (UniqueName: \"kubernetes.io/projected/5caee6ca-48fd-48a5-b84c-81d04b03a650-kube-api-access-k5rg5\") pod \"neutron-c7a9-account-create-6np7n\" (UID: \"5caee6ca-48fd-48a5-b84c-81d04b03a650\") " pod="openstack/neutron-c7a9-account-create-6np7n" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.029833 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wkc7b" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.083937 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5rg5\" (UniqueName: \"kubernetes.io/projected/5caee6ca-48fd-48a5-b84c-81d04b03a650-kube-api-access-k5rg5\") pod \"neutron-c7a9-account-create-6np7n\" (UID: \"5caee6ca-48fd-48a5-b84c-81d04b03a650\") " pod="openstack/neutron-c7a9-account-create-6np7n" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.115361 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5rg5\" (UniqueName: \"kubernetes.io/projected/5caee6ca-48fd-48a5-b84c-81d04b03a650-kube-api-access-k5rg5\") pod \"neutron-c7a9-account-create-6np7n\" (UID: \"5caee6ca-48fd-48a5-b84c-81d04b03a650\") " pod="openstack/neutron-c7a9-account-create-6np7n" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.145870 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d99fcb759-vc4hb"] Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.192452 4740 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/horizon-6f67cbf644-2n99k"] Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.194251 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.196219 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.198201 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.207287 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f67cbf644-2n99k"] Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.265776 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-759d8b8899-gj54k"] Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.278254 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c7a9-account-create-6np7n" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.285276 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5dd4b95776-lcxbt"] Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.286800 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.288119 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d46647a-6230-4561-bd21-a433ed55dad2-scripts\") pod \"horizon-6f67cbf644-2n99k\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.288388 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d46647a-6230-4561-bd21-a433ed55dad2-combined-ca-bundle\") pod \"horizon-6f67cbf644-2n99k\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.288433 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlfrb\" (UniqueName: \"kubernetes.io/projected/5d46647a-6230-4561-bd21-a433ed55dad2-kube-api-access-qlfrb\") pod \"horizon-6f67cbf644-2n99k\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.288471 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d46647a-6230-4561-bd21-a433ed55dad2-logs\") pod \"horizon-6f67cbf644-2n99k\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.288493 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d46647a-6230-4561-bd21-a433ed55dad2-config-data\") pod \"horizon-6f67cbf644-2n99k\" (UID: 
\"5d46647a-6230-4561-bd21-a433ed55dad2\") " pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.288511 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d46647a-6230-4561-bd21-a433ed55dad2-horizon-tls-certs\") pod \"horizon-6f67cbf644-2n99k\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.288583 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5d46647a-6230-4561-bd21-a433ed55dad2-horizon-secret-key\") pod \"horizon-6f67cbf644-2n99k\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.301048 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5dd4b95776-lcxbt"] Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.390847 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3762ae93-7451-4d99-aad4-f9c68666cf40-horizon-tls-certs\") pod \"horizon-5dd4b95776-lcxbt\" (UID: \"3762ae93-7451-4d99-aad4-f9c68666cf40\") " pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.390952 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5d46647a-6230-4561-bd21-a433ed55dad2-horizon-secret-key\") pod \"horizon-6f67cbf644-2n99k\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.390999 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-sjnzn\" (UniqueName: \"kubernetes.io/projected/3762ae93-7451-4d99-aad4-f9c68666cf40-kube-api-access-sjnzn\") pod \"horizon-5dd4b95776-lcxbt\" (UID: \"3762ae93-7451-4d99-aad4-f9c68666cf40\") " pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.391026 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3762ae93-7451-4d99-aad4-f9c68666cf40-horizon-secret-key\") pod \"horizon-5dd4b95776-lcxbt\" (UID: \"3762ae93-7451-4d99-aad4-f9c68666cf40\") " pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.391096 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3762ae93-7451-4d99-aad4-f9c68666cf40-logs\") pod \"horizon-5dd4b95776-lcxbt\" (UID: \"3762ae93-7451-4d99-aad4-f9c68666cf40\") " pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.391143 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d46647a-6230-4561-bd21-a433ed55dad2-scripts\") pod \"horizon-6f67cbf644-2n99k\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.391166 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3762ae93-7451-4d99-aad4-f9c68666cf40-combined-ca-bundle\") pod \"horizon-5dd4b95776-lcxbt\" (UID: \"3762ae93-7451-4d99-aad4-f9c68666cf40\") " pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.391204 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5d46647a-6230-4561-bd21-a433ed55dad2-combined-ca-bundle\") pod \"horizon-6f67cbf644-2n99k\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.391235 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlfrb\" (UniqueName: \"kubernetes.io/projected/5d46647a-6230-4561-bd21-a433ed55dad2-kube-api-access-qlfrb\") pod \"horizon-6f67cbf644-2n99k\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.391331 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3762ae93-7451-4d99-aad4-f9c68666cf40-config-data\") pod \"horizon-5dd4b95776-lcxbt\" (UID: \"3762ae93-7451-4d99-aad4-f9c68666cf40\") " pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.391360 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d46647a-6230-4561-bd21-a433ed55dad2-logs\") pod \"horizon-6f67cbf644-2n99k\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.391390 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3762ae93-7451-4d99-aad4-f9c68666cf40-scripts\") pod \"horizon-5dd4b95776-lcxbt\" (UID: \"3762ae93-7451-4d99-aad4-f9c68666cf40\") " pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.391415 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d46647a-6230-4561-bd21-a433ed55dad2-config-data\") pod 
\"horizon-6f67cbf644-2n99k\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.391442 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d46647a-6230-4561-bd21-a433ed55dad2-horizon-tls-certs\") pod \"horizon-6f67cbf644-2n99k\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.392958 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d46647a-6230-4561-bd21-a433ed55dad2-logs\") pod \"horizon-6f67cbf644-2n99k\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.393873 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d46647a-6230-4561-bd21-a433ed55dad2-config-data\") pod \"horizon-6f67cbf644-2n99k\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.395747 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d46647a-6230-4561-bd21-a433ed55dad2-horizon-tls-certs\") pod \"horizon-6f67cbf644-2n99k\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.396718 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d46647a-6230-4561-bd21-a433ed55dad2-scripts\") pod \"horizon-6f67cbf644-2n99k\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: 
I1009 10:44:43.411116 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d46647a-6230-4561-bd21-a433ed55dad2-combined-ca-bundle\") pod \"horizon-6f67cbf644-2n99k\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.415282 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5d46647a-6230-4561-bd21-a433ed55dad2-horizon-secret-key\") pod \"horizon-6f67cbf644-2n99k\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.419996 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlfrb\" (UniqueName: \"kubernetes.io/projected/5d46647a-6230-4561-bd21-a433ed55dad2-kube-api-access-qlfrb\") pod \"horizon-6f67cbf644-2n99k\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.493654 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3762ae93-7451-4d99-aad4-f9c68666cf40-config-data\") pod \"horizon-5dd4b95776-lcxbt\" (UID: \"3762ae93-7451-4d99-aad4-f9c68666cf40\") " pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.493715 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3762ae93-7451-4d99-aad4-f9c68666cf40-scripts\") pod \"horizon-5dd4b95776-lcxbt\" (UID: \"3762ae93-7451-4d99-aad4-f9c68666cf40\") " pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.494046 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3762ae93-7451-4d99-aad4-f9c68666cf40-horizon-tls-certs\") pod \"horizon-5dd4b95776-lcxbt\" (UID: \"3762ae93-7451-4d99-aad4-f9c68666cf40\") " pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.494210 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjnzn\" (UniqueName: \"kubernetes.io/projected/3762ae93-7451-4d99-aad4-f9c68666cf40-kube-api-access-sjnzn\") pod \"horizon-5dd4b95776-lcxbt\" (UID: \"3762ae93-7451-4d99-aad4-f9c68666cf40\") " pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.494232 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3762ae93-7451-4d99-aad4-f9c68666cf40-horizon-secret-key\") pod \"horizon-5dd4b95776-lcxbt\" (UID: \"3762ae93-7451-4d99-aad4-f9c68666cf40\") " pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.494265 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3762ae93-7451-4d99-aad4-f9c68666cf40-logs\") pod \"horizon-5dd4b95776-lcxbt\" (UID: \"3762ae93-7451-4d99-aad4-f9c68666cf40\") " pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.494326 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3762ae93-7451-4d99-aad4-f9c68666cf40-combined-ca-bundle\") pod \"horizon-5dd4b95776-lcxbt\" (UID: \"3762ae93-7451-4d99-aad4-f9c68666cf40\") " pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.495841 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/3762ae93-7451-4d99-aad4-f9c68666cf40-scripts\") pod \"horizon-5dd4b95776-lcxbt\" (UID: \"3762ae93-7451-4d99-aad4-f9c68666cf40\") " pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.496052 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3762ae93-7451-4d99-aad4-f9c68666cf40-logs\") pod \"horizon-5dd4b95776-lcxbt\" (UID: \"3762ae93-7451-4d99-aad4-f9c68666cf40\") " pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.497790 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3762ae93-7451-4d99-aad4-f9c68666cf40-config-data\") pod \"horizon-5dd4b95776-lcxbt\" (UID: \"3762ae93-7451-4d99-aad4-f9c68666cf40\") " pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.499152 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3762ae93-7451-4d99-aad4-f9c68666cf40-horizon-secret-key\") pod \"horizon-5dd4b95776-lcxbt\" (UID: \"3762ae93-7451-4d99-aad4-f9c68666cf40\") " pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.504042 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3762ae93-7451-4d99-aad4-f9c68666cf40-horizon-tls-certs\") pod \"horizon-5dd4b95776-lcxbt\" (UID: \"3762ae93-7451-4d99-aad4-f9c68666cf40\") " pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.504543 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3762ae93-7451-4d99-aad4-f9c68666cf40-combined-ca-bundle\") pod \"horizon-5dd4b95776-lcxbt\" (UID: 
\"3762ae93-7451-4d99-aad4-f9c68666cf40\") " pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.512614 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjnzn\" (UniqueName: \"kubernetes.io/projected/3762ae93-7451-4d99-aad4-f9c68666cf40-kube-api-access-sjnzn\") pod \"horizon-5dd4b95776-lcxbt\" (UID: \"3762ae93-7451-4d99-aad4-f9c68666cf40\") " pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.515589 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.622258 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.786828 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2207949f-1d38-47c3-9573-24dfa0a2db9e" path="/var/lib/kubelet/pods/2207949f-1d38-47c3-9573-24dfa0a2db9e/volumes" Oct 09 10:44:43 crc kubenswrapper[4740]: I1009 10:44:43.787643 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="996070ad-888f-48c2-a368-de97d22b13c1" path="/var/lib/kubelet/pods/996070ad-888f-48c2-a368-de97d22b13c1/volumes" Oct 09 10:44:45 crc kubenswrapper[4740]: I1009 10:44:45.006912 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:44:45 crc kubenswrapper[4740]: I1009 10:44:45.063331 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-66lt6"] Oct 09 10:44:45 crc kubenswrapper[4740]: I1009 10:44:45.063542 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" podUID="d40ec31e-4a50-4dae-a2b7-e48354125946" containerName="dnsmasq-dns" 
containerID="cri-o://e46f42a51bb24b608abd5dfea0bc4323aa6a816650f21f47036ac71ea4eebe4e" gracePeriod=10 Oct 09 10:44:46 crc kubenswrapper[4740]: I1009 10:44:46.474888 4740 generic.go:334] "Generic (PLEG): container finished" podID="d40ec31e-4a50-4dae-a2b7-e48354125946" containerID="e46f42a51bb24b608abd5dfea0bc4323aa6a816650f21f47036ac71ea4eebe4e" exitCode=0 Oct 09 10:44:46 crc kubenswrapper[4740]: I1009 10:44:46.474974 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" event={"ID":"d40ec31e-4a50-4dae-a2b7-e48354125946","Type":"ContainerDied","Data":"e46f42a51bb24b608abd5dfea0bc4323aa6a816650f21f47036ac71ea4eebe4e"} Oct 09 10:44:49 crc kubenswrapper[4740]: I1009 10:44:49.474266 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" podUID="d40ec31e-4a50-4dae-a2b7-e48354125946" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Oct 09 10:44:54 crc kubenswrapper[4740]: I1009 10:44:54.474434 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" podUID="d40ec31e-4a50-4dae-a2b7-e48354125946" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Oct 09 10:44:56 crc kubenswrapper[4740]: E1009 10:44:56.303343 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 09 10:44:56 crc kubenswrapper[4740]: E1009 10:44:56.304118 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n597h5dh646h665h67fh89h5ch79hbbh579h694hffh5c7h646h7fh579h675h58fh5bch645h59ch5c4h5fch5fh9dh5ch5b7hc7h84hcch9h7dq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-js2kx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5584f4df97-wsq5t_openstack(71cf9683-bcf4-4367-8365-08ef2fbe73d5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 10:44:56 crc kubenswrapper[4740]: E1009 
10:44:56.320163 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5584f4df97-wsq5t" podUID="71cf9683-bcf4-4367-8365-08ef2fbe73d5" Oct 09 10:44:56 crc kubenswrapper[4740]: E1009 10:44:56.339981 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 09 10:44:56 crc kubenswrapper[4740]: E1009 10:44:56.340192 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nd8h97h5dfhc8hdbhc5h68fh65chb8h88hc6h655h59chddh658h57dh5cbh9chcfh6hdch684h5dh584h586h55fh545h59fhf9h654h56dh7dq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-52cfx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-759d8b8899-gj54k_openstack(91408384-50b1-4bf9-9b73-3e82a64d73d2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 10:44:56 crc kubenswrapper[4740]: E1009 
10:44:56.357450 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-759d8b8899-gj54k" podUID="91408384-50b1-4bf9-9b73-3e82a64d73d2" Oct 09 10:44:58 crc kubenswrapper[4740]: E1009 10:44:58.891719 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Oct 09 10:44:58 crc kubenswrapper[4740]: E1009 10:44:58.892688 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sl4mj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-ktvhb_openstack(71a8fb50-724c-4b07-83e2-71d8ee90cb05): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 10:44:58 crc kubenswrapper[4740]: E1009 10:44:58.893801 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-ktvhb" 
podUID="71a8fb50-724c-4b07-83e2-71d8ee90cb05" Oct 09 10:44:58 crc kubenswrapper[4740]: E1009 10:44:58.909170 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 09 10:44:58 crc kubenswrapper[4740]: E1009 10:44:58.909317 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n694h574h68ch58fh5dh5f8hf7h644h5b8h57h54ch677h574h5c4h64h64ch579h669h68dh5c9hcdh6bh5bbh576h5f4h4h9fh65dhb6h9bh55bh647q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l582g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesys
tem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6d99fcb759-vc4hb_openstack(6176777d-0028-4420-a29c-cbf0b361c378): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 10:44:58 crc kubenswrapper[4740]: E1009 10:44:58.919418 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6d99fcb759-vc4hb" podUID="6176777d-0028-4420-a29c-cbf0b361c378" Oct 09 10:44:58 crc kubenswrapper[4740]: I1009 10:44:58.919536 4740 scope.go:117] "RemoveContainer" containerID="415d4466d534f91f0399efa0e35241f9c1630c6f916f374e910d9280cb5c8fd3" Oct 09 10:44:58 crc kubenswrapper[4740]: I1009 10:44:58.987304 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.114415 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.114506 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-scripts\") pod \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.114664 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-combined-ca-bundle\") pod \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.114740 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbpq6\" (UniqueName: \"kubernetes.io/projected/d9a2cdb4-f238-48ed-a0be-d3f895c72868-kube-api-access-mbpq6\") pod \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.114850 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9a2cdb4-f238-48ed-a0be-d3f895c72868-httpd-run\") pod \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.115419 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d9a2cdb4-f238-48ed-a0be-d3f895c72868-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d9a2cdb4-f238-48ed-a0be-d3f895c72868" (UID: "d9a2cdb4-f238-48ed-a0be-d3f895c72868"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.115573 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-internal-tls-certs\") pod \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.116220 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-config-data\") pod \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.117003 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9a2cdb4-f238-48ed-a0be-d3f895c72868-logs\") pod \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\" (UID: \"d9a2cdb4-f238-48ed-a0be-d3f895c72868\") " Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.117730 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9a2cdb4-f238-48ed-a0be-d3f895c72868-logs" (OuterVolumeSpecName: "logs") pod "d9a2cdb4-f238-48ed-a0be-d3f895c72868" (UID: "d9a2cdb4-f238-48ed-a0be-d3f895c72868"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.118078 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9a2cdb4-f238-48ed-a0be-d3f895c72868-logs\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.118131 4740 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9a2cdb4-f238-48ed-a0be-d3f895c72868-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.123052 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-scripts" (OuterVolumeSpecName: "scripts") pod "d9a2cdb4-f238-48ed-a0be-d3f895c72868" (UID: "d9a2cdb4-f238-48ed-a0be-d3f895c72868"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.123174 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9a2cdb4-f238-48ed-a0be-d3f895c72868-kube-api-access-mbpq6" (OuterVolumeSpecName: "kube-api-access-mbpq6") pod "d9a2cdb4-f238-48ed-a0be-d3f895c72868" (UID: "d9a2cdb4-f238-48ed-a0be-d3f895c72868"). InnerVolumeSpecName "kube-api-access-mbpq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.127219 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "d9a2cdb4-f238-48ed-a0be-d3f895c72868" (UID: "d9a2cdb4-f238-48ed-a0be-d3f895c72868"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.156031 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9a2cdb4-f238-48ed-a0be-d3f895c72868" (UID: "d9a2cdb4-f238-48ed-a0be-d3f895c72868"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.170210 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-config-data" (OuterVolumeSpecName: "config-data") pod "d9a2cdb4-f238-48ed-a0be-d3f895c72868" (UID: "d9a2cdb4-f238-48ed-a0be-d3f895c72868"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.172389 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d9a2cdb4-f238-48ed-a0be-d3f895c72868" (UID: "d9a2cdb4-f238-48ed-a0be-d3f895c72868"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.219457 4740 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.219498 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.219508 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.219517 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbpq6\" (UniqueName: \"kubernetes.io/projected/d9a2cdb4-f238-48ed-a0be-d3f895c72868-kube-api-access-mbpq6\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.219526 4740 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.219535 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a2cdb4-f238-48ed-a0be-d3f895c72868-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.236155 4740 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.321518 4740 reconciler_common.go:293] "Volume detached for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.603039 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d9a2cdb4-f238-48ed-a0be-d3f895c72868","Type":"ContainerDied","Data":"6ec0487084ba176f73a61d7af46f68742d6e6a78276d432db6f1e5b46fa1bedf"} Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.603080 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: E1009 10:44:59.606173 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-ktvhb" podUID="71a8fb50-724c-4b07-83e2-71d8ee90cb05" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.688100 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.699999 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.705151 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 10:44:59 crc kubenswrapper[4740]: E1009 10:44:59.705532 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a2cdb4-f238-48ed-a0be-d3f895c72868" containerName="glance-log" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.705544 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a2cdb4-f238-48ed-a0be-d3f895c72868" containerName="glance-log" Oct 09 10:44:59 crc kubenswrapper[4740]: E1009 10:44:59.705559 4740 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d9a2cdb4-f238-48ed-a0be-d3f895c72868" containerName="glance-httpd" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.705565 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a2cdb4-f238-48ed-a0be-d3f895c72868" containerName="glance-httpd" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.705734 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9a2cdb4-f238-48ed-a0be-d3f895c72868" containerName="glance-httpd" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.705768 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9a2cdb4-f238-48ed-a0be-d3f895c72868" containerName="glance-log" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.706645 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.708070 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.708172 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.713679 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.763729 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9a2cdb4-f238-48ed-a0be-d3f895c72868" path="/var/lib/kubelet/pods/d9a2cdb4-f238-48ed-a0be-d3f895c72868/volumes" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.832309 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.832711 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.832821 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.832911 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-logs\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.832970 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvmv9\" (UniqueName: \"kubernetes.io/projected/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-kube-api-access-jvmv9\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.833054 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.833077 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.833100 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.934707 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.935227 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-logs\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.935280 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvmv9\" (UniqueName: \"kubernetes.io/projected/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-kube-api-access-jvmv9\") pod \"glance-default-internal-api-0\" (UID: 
\"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.935345 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.935379 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.935399 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.935474 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.935525 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " 
pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.935798 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-logs\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.935931 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.939815 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.944533 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.944688 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 
10:44:59.944911 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.947826 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.964415 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvmv9\" (UniqueName: \"kubernetes.io/projected/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-kube-api-access-jvmv9\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:44:59 crc kubenswrapper[4740]: I1009 10:44:59.968529 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:45:00 crc kubenswrapper[4740]: I1009 10:45:00.028021 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 10:45:00 crc kubenswrapper[4740]: I1009 10:45:00.151238 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333445-lhw2q"] Oct 09 10:45:00 crc kubenswrapper[4740]: I1009 10:45:00.152526 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333445-lhw2q" Oct 09 10:45:00 crc kubenswrapper[4740]: I1009 10:45:00.156929 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 10:45:00 crc kubenswrapper[4740]: I1009 10:45:00.157579 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 10:45:00 crc kubenswrapper[4740]: I1009 10:45:00.159475 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333445-lhw2q"] Oct 09 10:45:00 crc kubenswrapper[4740]: I1009 10:45:00.243865 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czqmn\" (UniqueName: \"kubernetes.io/projected/c2692046-9849-4c7f-a506-5767b57dcc85-kube-api-access-czqmn\") pod \"collect-profiles-29333445-lhw2q\" (UID: \"c2692046-9849-4c7f-a506-5767b57dcc85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333445-lhw2q" Oct 09 10:45:00 crc kubenswrapper[4740]: I1009 10:45:00.243934 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2692046-9849-4c7f-a506-5767b57dcc85-config-volume\") pod \"collect-profiles-29333445-lhw2q\" (UID: \"c2692046-9849-4c7f-a506-5767b57dcc85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333445-lhw2q" Oct 09 10:45:00 crc kubenswrapper[4740]: I1009 10:45:00.244386 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2692046-9849-4c7f-a506-5767b57dcc85-secret-volume\") pod \"collect-profiles-29333445-lhw2q\" (UID: \"c2692046-9849-4c7f-a506-5767b57dcc85\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29333445-lhw2q" Oct 09 10:45:00 crc kubenswrapper[4740]: I1009 10:45:00.346182 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2692046-9849-4c7f-a506-5767b57dcc85-config-volume\") pod \"collect-profiles-29333445-lhw2q\" (UID: \"c2692046-9849-4c7f-a506-5767b57dcc85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333445-lhw2q" Oct 09 10:45:00 crc kubenswrapper[4740]: I1009 10:45:00.346314 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2692046-9849-4c7f-a506-5767b57dcc85-secret-volume\") pod \"collect-profiles-29333445-lhw2q\" (UID: \"c2692046-9849-4c7f-a506-5767b57dcc85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333445-lhw2q" Oct 09 10:45:00 crc kubenswrapper[4740]: I1009 10:45:00.346355 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czqmn\" (UniqueName: \"kubernetes.io/projected/c2692046-9849-4c7f-a506-5767b57dcc85-kube-api-access-czqmn\") pod \"collect-profiles-29333445-lhw2q\" (UID: \"c2692046-9849-4c7f-a506-5767b57dcc85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333445-lhw2q" Oct 09 10:45:00 crc kubenswrapper[4740]: I1009 10:45:00.347157 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2692046-9849-4c7f-a506-5767b57dcc85-config-volume\") pod \"collect-profiles-29333445-lhw2q\" (UID: \"c2692046-9849-4c7f-a506-5767b57dcc85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333445-lhw2q" Oct 09 10:45:00 crc kubenswrapper[4740]: I1009 10:45:00.354088 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c2692046-9849-4c7f-a506-5767b57dcc85-secret-volume\") pod \"collect-profiles-29333445-lhw2q\" (UID: \"c2692046-9849-4c7f-a506-5767b57dcc85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333445-lhw2q" Oct 09 10:45:00 crc kubenswrapper[4740]: I1009 10:45:00.368913 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czqmn\" (UniqueName: \"kubernetes.io/projected/c2692046-9849-4c7f-a506-5767b57dcc85-kube-api-access-czqmn\") pod \"collect-profiles-29333445-lhw2q\" (UID: \"c2692046-9849-4c7f-a506-5767b57dcc85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333445-lhw2q" Oct 09 10:45:00 crc kubenswrapper[4740]: I1009 10:45:00.478333 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333445-lhw2q" Oct 09 10:45:04 crc kubenswrapper[4740]: I1009 10:45:04.474502 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" podUID="d40ec31e-4a50-4dae-a2b7-e48354125946" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: i/o timeout" Oct 09 10:45:04 crc kubenswrapper[4740]: I1009 10:45:04.475246 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.386784 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.421674 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-759d8b8899-gj54k" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.433484 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5584f4df97-wsq5t" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.437371 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d99fcb759-vc4hb" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.527121 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-dns-svc\") pod \"d40ec31e-4a50-4dae-a2b7-e48354125946\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.527170 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71cf9683-bcf4-4367-8365-08ef2fbe73d5-scripts\") pod \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\" (UID: \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\") " Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.527205 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-dns-swift-storage-0\") pod \"d40ec31e-4a50-4dae-a2b7-e48354125946\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.527230 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/71cf9683-bcf4-4367-8365-08ef2fbe73d5-horizon-secret-key\") pod \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\" (UID: \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\") " Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.527255 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91408384-50b1-4bf9-9b73-3e82a64d73d2-config-data\") pod \"91408384-50b1-4bf9-9b73-3e82a64d73d2\" (UID: 
\"91408384-50b1-4bf9-9b73-3e82a64d73d2\") " Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.527289 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-ovsdbserver-sb\") pod \"d40ec31e-4a50-4dae-a2b7-e48354125946\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.527327 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6176777d-0028-4420-a29c-cbf0b361c378-logs\") pod \"6176777d-0028-4420-a29c-cbf0b361c378\" (UID: \"6176777d-0028-4420-a29c-cbf0b361c378\") " Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.527356 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-config\") pod \"d40ec31e-4a50-4dae-a2b7-e48354125946\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.527377 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52cfx\" (UniqueName: \"kubernetes.io/projected/91408384-50b1-4bf9-9b73-3e82a64d73d2-kube-api-access-52cfx\") pod \"91408384-50b1-4bf9-9b73-3e82a64d73d2\" (UID: \"91408384-50b1-4bf9-9b73-3e82a64d73d2\") " Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.527397 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l582g\" (UniqueName: \"kubernetes.io/projected/6176777d-0028-4420-a29c-cbf0b361c378-kube-api-access-l582g\") pod \"6176777d-0028-4420-a29c-cbf0b361c378\" (UID: \"6176777d-0028-4420-a29c-cbf0b361c378\") " Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.527448 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v28sx\" (UniqueName: 
\"kubernetes.io/projected/d40ec31e-4a50-4dae-a2b7-e48354125946-kube-api-access-v28sx\") pod \"d40ec31e-4a50-4dae-a2b7-e48354125946\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.527470 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-ovsdbserver-nb\") pod \"d40ec31e-4a50-4dae-a2b7-e48354125946\" (UID: \"d40ec31e-4a50-4dae-a2b7-e48354125946\") " Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.527485 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91408384-50b1-4bf9-9b73-3e82a64d73d2-logs\") pod \"91408384-50b1-4bf9-9b73-3e82a64d73d2\" (UID: \"91408384-50b1-4bf9-9b73-3e82a64d73d2\") " Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.527500 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71cf9683-bcf4-4367-8365-08ef2fbe73d5-config-data\") pod \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\" (UID: \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\") " Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.527529 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91408384-50b1-4bf9-9b73-3e82a64d73d2-scripts\") pod \"91408384-50b1-4bf9-9b73-3e82a64d73d2\" (UID: \"91408384-50b1-4bf9-9b73-3e82a64d73d2\") " Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.527547 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js2kx\" (UniqueName: \"kubernetes.io/projected/71cf9683-bcf4-4367-8365-08ef2fbe73d5-kube-api-access-js2kx\") pod \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\" (UID: \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\") " Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.527571 4740 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6176777d-0028-4420-a29c-cbf0b361c378-config-data\") pod \"6176777d-0028-4420-a29c-cbf0b361c378\" (UID: \"6176777d-0028-4420-a29c-cbf0b361c378\") " Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.527608 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6176777d-0028-4420-a29c-cbf0b361c378-scripts\") pod \"6176777d-0028-4420-a29c-cbf0b361c378\" (UID: \"6176777d-0028-4420-a29c-cbf0b361c378\") " Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.527624 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6176777d-0028-4420-a29c-cbf0b361c378-horizon-secret-key\") pod \"6176777d-0028-4420-a29c-cbf0b361c378\" (UID: \"6176777d-0028-4420-a29c-cbf0b361c378\") " Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.527706 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71cf9683-bcf4-4367-8365-08ef2fbe73d5-logs\") pod \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\" (UID: \"71cf9683-bcf4-4367-8365-08ef2fbe73d5\") " Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.527865 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6176777d-0028-4420-a29c-cbf0b361c378-logs" (OuterVolumeSpecName: "logs") pod "6176777d-0028-4420-a29c-cbf0b361c378" (UID: "6176777d-0028-4420-a29c-cbf0b361c378"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.528039 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/91408384-50b1-4bf9-9b73-3e82a64d73d2-horizon-secret-key\") pod \"91408384-50b1-4bf9-9b73-3e82a64d73d2\" (UID: \"91408384-50b1-4bf9-9b73-3e82a64d73d2\") " Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.528367 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71cf9683-bcf4-4367-8365-08ef2fbe73d5-scripts" (OuterVolumeSpecName: "scripts") pod "71cf9683-bcf4-4367-8365-08ef2fbe73d5" (UID: "71cf9683-bcf4-4367-8365-08ef2fbe73d5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.528486 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91408384-50b1-4bf9-9b73-3e82a64d73d2-config-data" (OuterVolumeSpecName: "config-data") pod "91408384-50b1-4bf9-9b73-3e82a64d73d2" (UID: "91408384-50b1-4bf9-9b73-3e82a64d73d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.528605 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71cf9683-bcf4-4367-8365-08ef2fbe73d5-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.528626 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91408384-50b1-4bf9-9b73-3e82a64d73d2-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.528639 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6176777d-0028-4420-a29c-cbf0b361c378-logs\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.529058 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91408384-50b1-4bf9-9b73-3e82a64d73d2-logs" (OuterVolumeSpecName: "logs") pod "91408384-50b1-4bf9-9b73-3e82a64d73d2" (UID: "91408384-50b1-4bf9-9b73-3e82a64d73d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.529472 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6176777d-0028-4420-a29c-cbf0b361c378-scripts" (OuterVolumeSpecName: "scripts") pod "6176777d-0028-4420-a29c-cbf0b361c378" (UID: "6176777d-0028-4420-a29c-cbf0b361c378"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.529563 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71cf9683-bcf4-4367-8365-08ef2fbe73d5-config-data" (OuterVolumeSpecName: "config-data") pod "71cf9683-bcf4-4367-8365-08ef2fbe73d5" (UID: "71cf9683-bcf4-4367-8365-08ef2fbe73d5"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.536064 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71cf9683-bcf4-4367-8365-08ef2fbe73d5-logs" (OuterVolumeSpecName: "logs") pod "71cf9683-bcf4-4367-8365-08ef2fbe73d5" (UID: "71cf9683-bcf4-4367-8365-08ef2fbe73d5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.536199 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91408384-50b1-4bf9-9b73-3e82a64d73d2-scripts" (OuterVolumeSpecName: "scripts") pod "91408384-50b1-4bf9-9b73-3e82a64d73d2" (UID: "91408384-50b1-4bf9-9b73-3e82a64d73d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.536341 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6176777d-0028-4420-a29c-cbf0b361c378-config-data" (OuterVolumeSpecName: "config-data") pod "6176777d-0028-4420-a29c-cbf0b361c378" (UID: "6176777d-0028-4420-a29c-cbf0b361c378"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.536798 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71cf9683-bcf4-4367-8365-08ef2fbe73d5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "71cf9683-bcf4-4367-8365-08ef2fbe73d5" (UID: "71cf9683-bcf4-4367-8365-08ef2fbe73d5"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.536839 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91408384-50b1-4bf9-9b73-3e82a64d73d2-kube-api-access-52cfx" (OuterVolumeSpecName: "kube-api-access-52cfx") pod "91408384-50b1-4bf9-9b73-3e82a64d73d2" (UID: "91408384-50b1-4bf9-9b73-3e82a64d73d2"). InnerVolumeSpecName "kube-api-access-52cfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.536857 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71cf9683-bcf4-4367-8365-08ef2fbe73d5-kube-api-access-js2kx" (OuterVolumeSpecName: "kube-api-access-js2kx") pod "71cf9683-bcf4-4367-8365-08ef2fbe73d5" (UID: "71cf9683-bcf4-4367-8365-08ef2fbe73d5"). InnerVolumeSpecName "kube-api-access-js2kx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.538657 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6176777d-0028-4420-a29c-cbf0b361c378-kube-api-access-l582g" (OuterVolumeSpecName: "kube-api-access-l582g") pod "6176777d-0028-4420-a29c-cbf0b361c378" (UID: "6176777d-0028-4420-a29c-cbf0b361c378"). InnerVolumeSpecName "kube-api-access-l582g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.540043 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6176777d-0028-4420-a29c-cbf0b361c378-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6176777d-0028-4420-a29c-cbf0b361c378" (UID: "6176777d-0028-4420-a29c-cbf0b361c378"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.540342 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d40ec31e-4a50-4dae-a2b7-e48354125946-kube-api-access-v28sx" (OuterVolumeSpecName: "kube-api-access-v28sx") pod "d40ec31e-4a50-4dae-a2b7-e48354125946" (UID: "d40ec31e-4a50-4dae-a2b7-e48354125946"). InnerVolumeSpecName "kube-api-access-v28sx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.554639 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91408384-50b1-4bf9-9b73-3e82a64d73d2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "91408384-50b1-4bf9-9b73-3e82a64d73d2" (UID: "91408384-50b1-4bf9-9b73-3e82a64d73d2"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.576796 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d40ec31e-4a50-4dae-a2b7-e48354125946" (UID: "d40ec31e-4a50-4dae-a2b7-e48354125946"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.591305 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d40ec31e-4a50-4dae-a2b7-e48354125946" (UID: "d40ec31e-4a50-4dae-a2b7-e48354125946"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.595109 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d40ec31e-4a50-4dae-a2b7-e48354125946" (UID: "d40ec31e-4a50-4dae-a2b7-e48354125946"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.600796 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-config" (OuterVolumeSpecName: "config") pod "d40ec31e-4a50-4dae-a2b7-e48354125946" (UID: "d40ec31e-4a50-4dae-a2b7-e48354125946"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.611218 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d40ec31e-4a50-4dae-a2b7-e48354125946" (UID: "d40ec31e-4a50-4dae-a2b7-e48354125946"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.631387 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v28sx\" (UniqueName: \"kubernetes.io/projected/d40ec31e-4a50-4dae-a2b7-e48354125946-kube-api-access-v28sx\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.631420 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.631432 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91408384-50b1-4bf9-9b73-3e82a64d73d2-logs\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.631443 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71cf9683-bcf4-4367-8365-08ef2fbe73d5-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.631456 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91408384-50b1-4bf9-9b73-3e82a64d73d2-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.631467 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js2kx\" (UniqueName: \"kubernetes.io/projected/71cf9683-bcf4-4367-8365-08ef2fbe73d5-kube-api-access-js2kx\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.631477 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6176777d-0028-4420-a29c-cbf0b361c378-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.631487 4740 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6176777d-0028-4420-a29c-cbf0b361c378-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.631497 4740 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6176777d-0028-4420-a29c-cbf0b361c378-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.631507 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71cf9683-bcf4-4367-8365-08ef2fbe73d5-logs\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.631518 4740 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/91408384-50b1-4bf9-9b73-3e82a64d73d2-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.631528 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.631538 4740 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.631551 4740 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/71cf9683-bcf4-4367-8365-08ef2fbe73d5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.631561 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.631572 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d40ec31e-4a50-4dae-a2b7-e48354125946-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.631581 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52cfx\" (UniqueName: \"kubernetes.io/projected/91408384-50b1-4bf9-9b73-3e82a64d73d2-kube-api-access-52cfx\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.631593 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l582g\" (UniqueName: \"kubernetes.io/projected/6176777d-0028-4420-a29c-cbf0b361c378-kube-api-access-l582g\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.663800 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-759d8b8899-gj54k" event={"ID":"91408384-50b1-4bf9-9b73-3e82a64d73d2","Type":"ContainerDied","Data":"7b8a1fea369826818f62652946db58cf12934c3ada704ae21c0526e57adf5770"} Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.663879 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-759d8b8899-gj54k" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.666191 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5584f4df97-wsq5t" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.666250 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5584f4df97-wsq5t" event={"ID":"71cf9683-bcf4-4367-8365-08ef2fbe73d5","Type":"ContainerDied","Data":"97a2a36b2293e28f4df238d885417a2bc40ede7c64b67d5200509b15d887edac"} Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.669389 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d99fcb759-vc4hb" event={"ID":"6176777d-0028-4420-a29c-cbf0b361c378","Type":"ContainerDied","Data":"9e158db5faa03782690e2afd15acd764430bb6ea776aa3780e58b01c7ab62ee6"} Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.669559 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d99fcb759-vc4hb" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.685039 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" event={"ID":"d40ec31e-4a50-4dae-a2b7-e48354125946","Type":"ContainerDied","Data":"ef480d5eabbdce7ee1ece0ccd6ba2c226f4b03b29c7eed76c825e35cb91fe083"} Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.686833 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.772993 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d99fcb759-vc4hb"] Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.778474 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6d99fcb759-vc4hb"] Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.815501 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-759d8b8899-gj54k"] Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.827075 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-759d8b8899-gj54k"] Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.841893 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-66lt6"] Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.848556 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-66lt6"] Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.863390 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5584f4df97-wsq5t"] Oct 09 10:45:07 crc kubenswrapper[4740]: I1009 10:45:07.871167 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5584f4df97-wsq5t"] Oct 09 10:45:08 crc kubenswrapper[4740]: E1009 10:45:08.598875 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 09 10:45:08 crc kubenswrapper[4740]: E1009 10:45:08.599022 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5mb7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-mw6z4_openstack(3062e734-0f07-4e8f-862e-a2906e7bbbd5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 10:45:08 crc kubenswrapper[4740]: E1009 10:45:08.600523 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-mw6z4" podUID="3062e734-0f07-4e8f-862e-a2906e7bbbd5" Oct 09 10:45:08 crc kubenswrapper[4740]: I1009 10:45:08.602347 4740 scope.go:117] "RemoveContainer" containerID="da0aa88afd1cb03c1ec2765072b38138de8e4e7997f33a81f9bafc5b025f0b70" Oct 09 10:45:08 crc kubenswrapper[4740]: E1009 10:45:08.739778 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-mw6z4" podUID="3062e734-0f07-4e8f-862e-a2906e7bbbd5" Oct 09 10:45:08 crc kubenswrapper[4740]: I1009 10:45:08.782337 4740 scope.go:117] "RemoveContainer" containerID="e17fae89ddc741aa737630e335cb189eb39baa06bacba265f8841f1c307cb442" Oct 09 10:45:08 crc kubenswrapper[4740]: I1009 10:45:08.894050 4740 scope.go:117] "RemoveContainer" containerID="e46f42a51bb24b608abd5dfea0bc4323aa6a816650f21f47036ac71ea4eebe4e" Oct 09 10:45:08 crc kubenswrapper[4740]: I1009 10:45:08.937140 4740 scope.go:117] "RemoveContainer" containerID="c941dd3f5c24549c433058e708d5c4d473c0c6306323e4dce440f351a88f93c0" Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.111514 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/horizon-5dd4b95776-lcxbt"] Oct 09 10:45:09 crc kubenswrapper[4740]: W1009 10:45:09.133088 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3762ae93_7451_4d99_aad4_f9c68666cf40.slice/crio-a32a5d3d3833cc32942ac9d3d04cf1f498ab291ba1fa5af7df4c2c64e88c09e9 WatchSource:0}: Error finding container a32a5d3d3833cc32942ac9d3d04cf1f498ab291ba1fa5af7df4c2c64e88c09e9: Status 404 returned error can't find the container with id a32a5d3d3833cc32942ac9d3d04cf1f498ab291ba1fa5af7df4c2c64e88c09e9 Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.210564 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wkc7b"] Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.232203 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f67cbf644-2n99k"] Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.248413 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c7a9-account-create-6np7n"] Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.298287 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 10:45:09 crc kubenswrapper[4740]: W1009 10:45:09.310246 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48d3bba1_3d4b_49eb_bd78_41e3e91267b5.slice/crio-aeb2ccd85c0c75886c0fd053d39886f26936c3bdc5d5e25cc1b09495de41561b WatchSource:0}: Error finding container aeb2ccd85c0c75886c0fd053d39886f26936c3bdc5d5e25cc1b09495de41561b: Status 404 returned error can't find the container with id aeb2ccd85c0c75886c0fd053d39886f26936c3bdc5d5e25cc1b09495de41561b Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.476934 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-66lt6" podUID="d40ec31e-4a50-4dae-a2b7-e48354125946" 
containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: i/o timeout" Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.523266 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333445-lhw2q"] Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.677433 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.746474 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dd4b95776-lcxbt" event={"ID":"3762ae93-7451-4d99-aad4-f9c68666cf40","Type":"ContainerStarted","Data":"a32a5d3d3833cc32942ac9d3d04cf1f498ab291ba1fa5af7df4c2c64e88c09e9"} Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.750011 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f91334a-239f-4459-b885-aa9865bc6a04","Type":"ContainerStarted","Data":"c26954608c565ffe58564c831888fc6b660bc839108a510935b7f6da3bdfe85d"} Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.752333 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wkc7b" event={"ID":"4808b047-cb78-4910-8c22-65514e99c2cc","Type":"ContainerStarted","Data":"10035550e5c5e9edbc0f68f1e00132b0ba979f7e45b5a19fcc5c069c73e6b908"} Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.752405 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wkc7b" event={"ID":"4808b047-cb78-4910-8c22-65514e99c2cc","Type":"ContainerStarted","Data":"83a108c794650a9c33993790ba732e32133ed0be71b1d02c6beb98ad165e0ef4"} Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.784728 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-wkc7b" podStartSLOduration=27.784706443 podStartE2EDuration="27.784706443s" podCreationTimestamp="2025-10-09 10:44:42 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:45:09.769953633 +0000 UTC m=+1048.732154024" watchObservedRunningTime="2025-10-09 10:45:09.784706443 +0000 UTC m=+1048.746906824" Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.808056 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6176777d-0028-4420-a29c-cbf0b361c378" path="/var/lib/kubelet/pods/6176777d-0028-4420-a29c-cbf0b361c378/volumes" Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.808603 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71cf9683-bcf4-4367-8365-08ef2fbe73d5" path="/var/lib/kubelet/pods/71cf9683-bcf4-4367-8365-08ef2fbe73d5/volumes" Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.809312 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91408384-50b1-4bf9-9b73-3e82a64d73d2" path="/var/lib/kubelet/pods/91408384-50b1-4bf9-9b73-3e82a64d73d2/volumes" Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.809816 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d40ec31e-4a50-4dae-a2b7-e48354125946" path="/var/lib/kubelet/pods/d40ec31e-4a50-4dae-a2b7-e48354125946/volumes" Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.811248 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"31b6faa7-7a5d-47ba-8ee8-08866ee2933e","Type":"ContainerStarted","Data":"c7c90232c3159a0e2aae51084e4ea6c15871fa59f7bc2cd20505cf60711548d2"} Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.824850 4740 generic.go:334] "Generic (PLEG): container finished" podID="5caee6ca-48fd-48a5-b84c-81d04b03a650" containerID="b8ab9aba5a57457832fd6b9e6c262c311855d895d6d4aecc56f1305481ffa62e" exitCode=0 Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.824943 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-c7a9-account-create-6np7n" event={"ID":"5caee6ca-48fd-48a5-b84c-81d04b03a650","Type":"ContainerDied","Data":"b8ab9aba5a57457832fd6b9e6c262c311855d895d6d4aecc56f1305481ffa62e"} Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.824973 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c7a9-account-create-6np7n" event={"ID":"5caee6ca-48fd-48a5-b84c-81d04b03a650","Type":"ContainerStarted","Data":"86a7c97e22b902be7d32dcdaf42ea007b730afcdc88b0ef524d12bc448a7e79f"} Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.835242 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dzzbk" event={"ID":"be99ba98-fb4b-4609-986e-3636a4a8f244","Type":"ContainerStarted","Data":"e83d5bb2dc4d7b48caa48225f8f2e9b1f3a576acb8ab58452fb925c2655a5947"} Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.836537 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333445-lhw2q" event={"ID":"c2692046-9849-4c7f-a506-5767b57dcc85","Type":"ContainerStarted","Data":"a87cbdfc26e23ef548bde9ee94325845534c5a3bd09a087fc52a7f1d30ce6c9d"} Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.862780 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-dzzbk" podStartSLOduration=5.048056208 podStartE2EDuration="35.862746257s" podCreationTimestamp="2025-10-09 10:44:34 +0000 UTC" firstStartedPulling="2025-10-09 10:44:36.42734176 +0000 UTC m=+1015.389542141" lastFinishedPulling="2025-10-09 10:45:07.242031809 +0000 UTC m=+1046.204232190" observedRunningTime="2025-10-09 10:45:09.861267797 +0000 UTC m=+1048.823468178" watchObservedRunningTime="2025-10-09 10:45:09.862746257 +0000 UTC m=+1048.824946638" Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.888005 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"48d3bba1-3d4b-49eb-bd78-41e3e91267b5","Type":"ContainerStarted","Data":"aeb2ccd85c0c75886c0fd053d39886f26936c3bdc5d5e25cc1b09495de41561b"} Oct 09 10:45:09 crc kubenswrapper[4740]: I1009 10:45:09.896919 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f67cbf644-2n99k" event={"ID":"5d46647a-6230-4561-bd21-a433ed55dad2","Type":"ContainerStarted","Data":"fee3d95f7ac812514644349f20350b4d2e83230b04c919591291a3ab9d1cd1cd"} Oct 09 10:45:10 crc kubenswrapper[4740]: I1009 10:45:10.910425 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dd4b95776-lcxbt" event={"ID":"3762ae93-7451-4d99-aad4-f9c68666cf40","Type":"ContainerStarted","Data":"e74ee97c101a171ef8afbc67dad67db78d219836bb89991b2294c6cc6a0274f8"} Oct 09 10:45:10 crc kubenswrapper[4740]: I1009 10:45:10.910943 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dd4b95776-lcxbt" event={"ID":"3762ae93-7451-4d99-aad4-f9c68666cf40","Type":"ContainerStarted","Data":"ce0d3b4377608c27a4cd3a582db57acdb0139fd18112f6f716dd616445b19b6f"} Oct 09 10:45:10 crc kubenswrapper[4740]: I1009 10:45:10.914089 4740 generic.go:334] "Generic (PLEG): container finished" podID="c2692046-9849-4c7f-a506-5767b57dcc85" containerID="5b4fce3a7571db0590af42395ac520a1ac480a36e1744b04d075202d1a3f6fe3" exitCode=0 Oct 09 10:45:10 crc kubenswrapper[4740]: I1009 10:45:10.914151 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333445-lhw2q" event={"ID":"c2692046-9849-4c7f-a506-5767b57dcc85","Type":"ContainerDied","Data":"5b4fce3a7571db0590af42395ac520a1ac480a36e1744b04d075202d1a3f6fe3"} Oct 09 10:45:10 crc kubenswrapper[4740]: I1009 10:45:10.918711 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48d3bba1-3d4b-49eb-bd78-41e3e91267b5","Type":"ContainerStarted","Data":"13770bb02a1b622fdea0afbccadf2dcc3a0941fa6e61d9519a879eda24bae7ed"} 
Oct 09 10:45:10 crc kubenswrapper[4740]: I1009 10:45:10.921267 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"31b6faa7-7a5d-47ba-8ee8-08866ee2933e","Type":"ContainerStarted","Data":"437be0aeabdf24344b8a91f29200dc317f18ebb4acb4cf84d5ff25e799b29027"} Oct 09 10:45:10 crc kubenswrapper[4740]: I1009 10:45:10.937179 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5dd4b95776-lcxbt" podStartSLOduration=27.368173038 podStartE2EDuration="27.93716072s" podCreationTimestamp="2025-10-09 10:44:43 +0000 UTC" firstStartedPulling="2025-10-09 10:45:09.136541866 +0000 UTC m=+1048.098742237" lastFinishedPulling="2025-10-09 10:45:09.705529538 +0000 UTC m=+1048.667729919" observedRunningTime="2025-10-09 10:45:10.931134246 +0000 UTC m=+1049.893334627" watchObservedRunningTime="2025-10-09 10:45:10.93716072 +0000 UTC m=+1049.899361101" Oct 09 10:45:11 crc kubenswrapper[4740]: I1009 10:45:11.237421 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c7a9-account-create-6np7n" Oct 09 10:45:11 crc kubenswrapper[4740]: I1009 10:45:11.323436 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5rg5\" (UniqueName: \"kubernetes.io/projected/5caee6ca-48fd-48a5-b84c-81d04b03a650-kube-api-access-k5rg5\") pod \"5caee6ca-48fd-48a5-b84c-81d04b03a650\" (UID: \"5caee6ca-48fd-48a5-b84c-81d04b03a650\") " Oct 09 10:45:11 crc kubenswrapper[4740]: I1009 10:45:11.328987 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5caee6ca-48fd-48a5-b84c-81d04b03a650-kube-api-access-k5rg5" (OuterVolumeSpecName: "kube-api-access-k5rg5") pod "5caee6ca-48fd-48a5-b84c-81d04b03a650" (UID: "5caee6ca-48fd-48a5-b84c-81d04b03a650"). InnerVolumeSpecName "kube-api-access-k5rg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:45:11 crc kubenswrapper[4740]: I1009 10:45:11.425607 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5rg5\" (UniqueName: \"kubernetes.io/projected/5caee6ca-48fd-48a5-b84c-81d04b03a650-kube-api-access-k5rg5\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:11 crc kubenswrapper[4740]: I1009 10:45:11.930892 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f91334a-239f-4459-b885-aa9865bc6a04","Type":"ContainerStarted","Data":"d4b67a6cd151baa83233a1bd259e38cc920f1cd7d6b272190590a2e057622e13"} Oct 09 10:45:11 crc kubenswrapper[4740]: I1009 10:45:11.933669 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="48d3bba1-3d4b-49eb-bd78-41e3e91267b5" containerName="glance-log" containerID="cri-o://13770bb02a1b622fdea0afbccadf2dcc3a0941fa6e61d9519a879eda24bae7ed" gracePeriod=30 Oct 09 10:45:11 crc kubenswrapper[4740]: I1009 10:45:11.933951 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48d3bba1-3d4b-49eb-bd78-41e3e91267b5","Type":"ContainerStarted","Data":"e0a01d30e0d74074f71f3b9fc7e048f899077313fe1ea5c3e083b1e69e9ef78e"} Oct 09 10:45:11 crc kubenswrapper[4740]: I1009 10:45:11.933991 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="48d3bba1-3d4b-49eb-bd78-41e3e91267b5" containerName="glance-httpd" containerID="cri-o://e0a01d30e0d74074f71f3b9fc7e048f899077313fe1ea5c3e083b1e69e9ef78e" gracePeriod=30 Oct 09 10:45:11 crc kubenswrapper[4740]: I1009 10:45:11.936382 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c7a9-account-create-6np7n" Oct 09 10:45:11 crc kubenswrapper[4740]: I1009 10:45:11.936384 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c7a9-account-create-6np7n" event={"ID":"5caee6ca-48fd-48a5-b84c-81d04b03a650","Type":"ContainerDied","Data":"86a7c97e22b902be7d32dcdaf42ea007b730afcdc88b0ef524d12bc448a7e79f"} Oct 09 10:45:11 crc kubenswrapper[4740]: I1009 10:45:11.936514 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86a7c97e22b902be7d32dcdaf42ea007b730afcdc88b0ef524d12bc448a7e79f" Oct 09 10:45:11 crc kubenswrapper[4740]: I1009 10:45:11.938030 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"31b6faa7-7a5d-47ba-8ee8-08866ee2933e","Type":"ContainerStarted","Data":"7245c76c0a9620f25cb294bedb5646245c05a4c5c47ec0a715e2c39a30474bed"} Oct 09 10:45:11 crc kubenswrapper[4740]: I1009 10:45:11.940396 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f67cbf644-2n99k" event={"ID":"5d46647a-6230-4561-bd21-a433ed55dad2","Type":"ContainerStarted","Data":"bf82d8ce076fba29ba8e3099eed54d6fb6ae0c40849c42e47a11986d72082416"} Oct 09 10:45:11 crc kubenswrapper[4740]: I1009 10:45:11.940443 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f67cbf644-2n99k" event={"ID":"5d46647a-6230-4561-bd21-a433ed55dad2","Type":"ContainerStarted","Data":"f484446da81e0b28190c45af24087e227074250e4f8f8dace13601acb254a05a"} Oct 09 10:45:11 crc kubenswrapper[4740]: I1009 10:45:11.942276 4740 generic.go:334] "Generic (PLEG): container finished" podID="be99ba98-fb4b-4609-986e-3636a4a8f244" containerID="e83d5bb2dc4d7b48caa48225f8f2e9b1f3a576acb8ab58452fb925c2655a5947" exitCode=0 Oct 09 10:45:11 crc kubenswrapper[4740]: I1009 10:45:11.942351 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dzzbk" 
event={"ID":"be99ba98-fb4b-4609-986e-3636a4a8f244","Type":"ContainerDied","Data":"e83d5bb2dc4d7b48caa48225f8f2e9b1f3a576acb8ab58452fb925c2655a5947"} Oct 09 10:45:11 crc kubenswrapper[4740]: I1009 10:45:11.963018 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=29.962997167 podStartE2EDuration="29.962997167s" podCreationTimestamp="2025-10-09 10:44:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:45:11.957523639 +0000 UTC m=+1050.919724020" watchObservedRunningTime="2025-10-09 10:45:11.962997167 +0000 UTC m=+1050.925197548" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.013718 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6f67cbf644-2n99k" podStartSLOduration=28.124204237 podStartE2EDuration="29.01369644s" podCreationTimestamp="2025-10-09 10:44:43 +0000 UTC" firstStartedPulling="2025-10-09 10:45:09.233136943 +0000 UTC m=+1048.195337324" lastFinishedPulling="2025-10-09 10:45:10.122629136 +0000 UTC m=+1049.084829527" observedRunningTime="2025-10-09 10:45:12.010168275 +0000 UTC m=+1050.972368656" watchObservedRunningTime="2025-10-09 10:45:12.01369644 +0000 UTC m=+1050.975896821" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.015499 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=13.015490709 podStartE2EDuration="13.015490709s" podCreationTimestamp="2025-10-09 10:44:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:45:11.984304494 +0000 UTC m=+1050.946504875" watchObservedRunningTime="2025-10-09 10:45:12.015490709 +0000 UTC m=+1050.977691090" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.402415 4740 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333445-lhw2q" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.444666 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2692046-9849-4c7f-a506-5767b57dcc85-secret-volume\") pod \"c2692046-9849-4c7f-a506-5767b57dcc85\" (UID: \"c2692046-9849-4c7f-a506-5767b57dcc85\") " Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.444772 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czqmn\" (UniqueName: \"kubernetes.io/projected/c2692046-9849-4c7f-a506-5767b57dcc85-kube-api-access-czqmn\") pod \"c2692046-9849-4c7f-a506-5767b57dcc85\" (UID: \"c2692046-9849-4c7f-a506-5767b57dcc85\") " Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.444847 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2692046-9849-4c7f-a506-5767b57dcc85-config-volume\") pod \"c2692046-9849-4c7f-a506-5767b57dcc85\" (UID: \"c2692046-9849-4c7f-a506-5767b57dcc85\") " Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.445809 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2692046-9849-4c7f-a506-5767b57dcc85-config-volume" (OuterVolumeSpecName: "config-volume") pod "c2692046-9849-4c7f-a506-5767b57dcc85" (UID: "c2692046-9849-4c7f-a506-5767b57dcc85"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.450976 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2692046-9849-4c7f-a506-5767b57dcc85-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c2692046-9849-4c7f-a506-5767b57dcc85" (UID: "c2692046-9849-4c7f-a506-5767b57dcc85"). 
InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.451727 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2692046-9849-4c7f-a506-5767b57dcc85-kube-api-access-czqmn" (OuterVolumeSpecName: "kube-api-access-czqmn") pod "c2692046-9849-4c7f-a506-5767b57dcc85" (UID: "c2692046-9849-4c7f-a506-5767b57dcc85"). InnerVolumeSpecName "kube-api-access-czqmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.540936 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.546807 4740 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2692046-9849-4c7f-a506-5767b57dcc85-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.546843 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czqmn\" (UniqueName: \"kubernetes.io/projected/c2692046-9849-4c7f-a506-5767b57dcc85-kube-api-access-czqmn\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.546886 4740 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2692046-9849-4c7f-a506-5767b57dcc85-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.648401 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-scripts\") pod \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.648450 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-combined-ca-bundle\") pod \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.648517 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.648573 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-public-tls-certs\") pod \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.648607 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5mk9\" (UniqueName: \"kubernetes.io/projected/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-kube-api-access-s5mk9\") pod \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.648644 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-logs\") pod \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.648664 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-config-data\") pod \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " Oct 09 10:45:12 
crc kubenswrapper[4740]: I1009 10:45:12.648720 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-httpd-run\") pod \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\" (UID: \"48d3bba1-3d4b-49eb-bd78-41e3e91267b5\") " Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.649556 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "48d3bba1-3d4b-49eb-bd78-41e3e91267b5" (UID: "48d3bba1-3d4b-49eb-bd78-41e3e91267b5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.657546 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-logs" (OuterVolumeSpecName: "logs") pod "48d3bba1-3d4b-49eb-bd78-41e3e91267b5" (UID: "48d3bba1-3d4b-49eb-bd78-41e3e91267b5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.663998 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "48d3bba1-3d4b-49eb-bd78-41e3e91267b5" (UID: "48d3bba1-3d4b-49eb-bd78-41e3e91267b5"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.666620 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-kube-api-access-s5mk9" (OuterVolumeSpecName: "kube-api-access-s5mk9") pod "48d3bba1-3d4b-49eb-bd78-41e3e91267b5" (UID: "48d3bba1-3d4b-49eb-bd78-41e3e91267b5"). InnerVolumeSpecName "kube-api-access-s5mk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.672933 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-scripts" (OuterVolumeSpecName: "scripts") pod "48d3bba1-3d4b-49eb-bd78-41e3e91267b5" (UID: "48d3bba1-3d4b-49eb-bd78-41e3e91267b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.692982 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48d3bba1-3d4b-49eb-bd78-41e3e91267b5" (UID: "48d3bba1-3d4b-49eb-bd78-41e3e91267b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.694950 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "48d3bba1-3d4b-49eb-bd78-41e3e91267b5" (UID: "48d3bba1-3d4b-49eb-bd78-41e3e91267b5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.719587 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-config-data" (OuterVolumeSpecName: "config-data") pod "48d3bba1-3d4b-49eb-bd78-41e3e91267b5" (UID: "48d3bba1-3d4b-49eb-bd78-41e3e91267b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.750467 4740 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.750634 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5mk9\" (UniqueName: \"kubernetes.io/projected/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-kube-api-access-s5mk9\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.750647 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-logs\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.750656 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.750664 4740 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.750672 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.750680 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48d3bba1-3d4b-49eb-bd78-41e3e91267b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.750714 4740 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.774274 4740 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.851835 4740 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.964884 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333445-lhw2q" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.964884 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333445-lhw2q" event={"ID":"c2692046-9849-4c7f-a506-5767b57dcc85","Type":"ContainerDied","Data":"a87cbdfc26e23ef548bde9ee94325845534c5a3bd09a087fc52a7f1d30ce6c9d"} Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.965388 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a87cbdfc26e23ef548bde9ee94325845534c5a3bd09a087fc52a7f1d30ce6c9d" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.971982 4740 generic.go:334] "Generic (PLEG): container finished" podID="48d3bba1-3d4b-49eb-bd78-41e3e91267b5" containerID="e0a01d30e0d74074f71f3b9fc7e048f899077313fe1ea5c3e083b1e69e9ef78e" exitCode=0 Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.972011 4740 generic.go:334] "Generic (PLEG): container finished" podID="48d3bba1-3d4b-49eb-bd78-41e3e91267b5" containerID="13770bb02a1b622fdea0afbccadf2dcc3a0941fa6e61d9519a879eda24bae7ed" exitCode=143 Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.972959 4740 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.974541 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48d3bba1-3d4b-49eb-bd78-41e3e91267b5","Type":"ContainerDied","Data":"e0a01d30e0d74074f71f3b9fc7e048f899077313fe1ea5c3e083b1e69e9ef78e"} Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.974571 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48d3bba1-3d4b-49eb-bd78-41e3e91267b5","Type":"ContainerDied","Data":"13770bb02a1b622fdea0afbccadf2dcc3a0941fa6e61d9519a879eda24bae7ed"} Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.974583 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48d3bba1-3d4b-49eb-bd78-41e3e91267b5","Type":"ContainerDied","Data":"aeb2ccd85c0c75886c0fd053d39886f26936c3bdc5d5e25cc1b09495de41561b"} Oct 09 10:45:12 crc kubenswrapper[4740]: I1009 10:45:12.974598 4740 scope.go:117] "RemoveContainer" containerID="e0a01d30e0d74074f71f3b9fc7e048f899077313fe1ea5c3e083b1e69e9ef78e" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.017805 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.041309 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.050213 4740 scope.go:117] "RemoveContainer" containerID="13770bb02a1b622fdea0afbccadf2dcc3a0941fa6e61d9519a879eda24bae7ed" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.054089 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 10:45:13 crc kubenswrapper[4740]: E1009 10:45:13.054506 4740 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c2692046-9849-4c7f-a506-5767b57dcc85" containerName="collect-profiles" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.054529 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2692046-9849-4c7f-a506-5767b57dcc85" containerName="collect-profiles" Oct 09 10:45:13 crc kubenswrapper[4740]: E1009 10:45:13.054551 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5caee6ca-48fd-48a5-b84c-81d04b03a650" containerName="mariadb-account-create" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.054561 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5caee6ca-48fd-48a5-b84c-81d04b03a650" containerName="mariadb-account-create" Oct 09 10:45:13 crc kubenswrapper[4740]: E1009 10:45:13.054581 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d40ec31e-4a50-4dae-a2b7-e48354125946" containerName="init" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.054590 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40ec31e-4a50-4dae-a2b7-e48354125946" containerName="init" Oct 09 10:45:13 crc kubenswrapper[4740]: E1009 10:45:13.054610 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48d3bba1-3d4b-49eb-bd78-41e3e91267b5" containerName="glance-httpd" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.054619 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="48d3bba1-3d4b-49eb-bd78-41e3e91267b5" containerName="glance-httpd" Oct 09 10:45:13 crc kubenswrapper[4740]: E1009 10:45:13.054648 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48d3bba1-3d4b-49eb-bd78-41e3e91267b5" containerName="glance-log" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.054656 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="48d3bba1-3d4b-49eb-bd78-41e3e91267b5" containerName="glance-log" Oct 09 10:45:13 crc kubenswrapper[4740]: E1009 10:45:13.054671 4740 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="d40ec31e-4a50-4dae-a2b7-e48354125946" containerName="dnsmasq-dns" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.054678 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40ec31e-4a50-4dae-a2b7-e48354125946" containerName="dnsmasq-dns" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.054924 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2692046-9849-4c7f-a506-5767b57dcc85" containerName="collect-profiles" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.054942 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="5caee6ca-48fd-48a5-b84c-81d04b03a650" containerName="mariadb-account-create" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.054958 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="d40ec31e-4a50-4dae-a2b7-e48354125946" containerName="dnsmasq-dns" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.054967 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="48d3bba1-3d4b-49eb-bd78-41e3e91267b5" containerName="glance-httpd" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.054989 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="48d3bba1-3d4b-49eb-bd78-41e3e91267b5" containerName="glance-log" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.056344 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.069860 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.081475 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.114345 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.135986 4740 scope.go:117] "RemoveContainer" containerID="e0a01d30e0d74074f71f3b9fc7e048f899077313fe1ea5c3e083b1e69e9ef78e" Oct 09 10:45:13 crc kubenswrapper[4740]: E1009 10:45:13.140473 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0a01d30e0d74074f71f3b9fc7e048f899077313fe1ea5c3e083b1e69e9ef78e\": container with ID starting with e0a01d30e0d74074f71f3b9fc7e048f899077313fe1ea5c3e083b1e69e9ef78e not found: ID does not exist" containerID="e0a01d30e0d74074f71f3b9fc7e048f899077313fe1ea5c3e083b1e69e9ef78e" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.140576 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0a01d30e0d74074f71f3b9fc7e048f899077313fe1ea5c3e083b1e69e9ef78e"} err="failed to get container status \"e0a01d30e0d74074f71f3b9fc7e048f899077313fe1ea5c3e083b1e69e9ef78e\": rpc error: code = NotFound desc = could not find container \"e0a01d30e0d74074f71f3b9fc7e048f899077313fe1ea5c3e083b1e69e9ef78e\": container with ID starting with e0a01d30e0d74074f71f3b9fc7e048f899077313fe1ea5c3e083b1e69e9ef78e not found: ID does not exist" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.140610 4740 scope.go:117] "RemoveContainer" 
containerID="13770bb02a1b622fdea0afbccadf2dcc3a0941fa6e61d9519a879eda24bae7ed" Oct 09 10:45:13 crc kubenswrapper[4740]: E1009 10:45:13.141383 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13770bb02a1b622fdea0afbccadf2dcc3a0941fa6e61d9519a879eda24bae7ed\": container with ID starting with 13770bb02a1b622fdea0afbccadf2dcc3a0941fa6e61d9519a879eda24bae7ed not found: ID does not exist" containerID="13770bb02a1b622fdea0afbccadf2dcc3a0941fa6e61d9519a879eda24bae7ed" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.141438 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13770bb02a1b622fdea0afbccadf2dcc3a0941fa6e61d9519a879eda24bae7ed"} err="failed to get container status \"13770bb02a1b622fdea0afbccadf2dcc3a0941fa6e61d9519a879eda24bae7ed\": rpc error: code = NotFound desc = could not find container \"13770bb02a1b622fdea0afbccadf2dcc3a0941fa6e61d9519a879eda24bae7ed\": container with ID starting with 13770bb02a1b622fdea0afbccadf2dcc3a0941fa6e61d9519a879eda24bae7ed not found: ID does not exist" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.141490 4740 scope.go:117] "RemoveContainer" containerID="e0a01d30e0d74074f71f3b9fc7e048f899077313fe1ea5c3e083b1e69e9ef78e" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.142964 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0a01d30e0d74074f71f3b9fc7e048f899077313fe1ea5c3e083b1e69e9ef78e"} err="failed to get container status \"e0a01d30e0d74074f71f3b9fc7e048f899077313fe1ea5c3e083b1e69e9ef78e\": rpc error: code = NotFound desc = could not find container \"e0a01d30e0d74074f71f3b9fc7e048f899077313fe1ea5c3e083b1e69e9ef78e\": container with ID starting with e0a01d30e0d74074f71f3b9fc7e048f899077313fe1ea5c3e083b1e69e9ef78e not found: ID does not exist" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.142984 4740 scope.go:117] 
"RemoveContainer" containerID="13770bb02a1b622fdea0afbccadf2dcc3a0941fa6e61d9519a879eda24bae7ed" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.148915 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13770bb02a1b622fdea0afbccadf2dcc3a0941fa6e61d9519a879eda24bae7ed"} err="failed to get container status \"13770bb02a1b622fdea0afbccadf2dcc3a0941fa6e61d9519a879eda24bae7ed\": rpc error: code = NotFound desc = could not find container \"13770bb02a1b622fdea0afbccadf2dcc3a0941fa6e61d9519a879eda24bae7ed\": container with ID starting with 13770bb02a1b622fdea0afbccadf2dcc3a0941fa6e61d9519a879eda24bae7ed not found: ID does not exist" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.181008 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-f8w6m"] Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.182591 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-f8w6m" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.186699 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-f8w6m"] Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.188704 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.188928 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.189049 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-shv4j" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.258776 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rshk\" (UniqueName: \"kubernetes.io/projected/09308063-0c8c-4f0a-83f5-779364607b38-kube-api-access-8rshk\") pod 
\"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.258813 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09308063-0c8c-4f0a-83f5-779364607b38-logs\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.258830 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-scripts\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.258846 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.258888 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.258910 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.258931 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09308063-0c8c-4f0a-83f5-779364607b38-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.258945 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-config-data\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.362395 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rshk\" (UniqueName: \"kubernetes.io/projected/09308063-0c8c-4f0a-83f5-779364607b38-kube-api-access-8rshk\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.362456 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-scripts\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.362481 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09308063-0c8c-4f0a-83f5-779364607b38-logs\") pod \"glance-default-external-api-0\" (UID: 
\"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.362506 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.362557 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb96e05e-80fb-4eec-b609-123ed43152ae-config\") pod \"neutron-db-sync-f8w6m\" (UID: \"fb96e05e-80fb-4eec-b609-123ed43152ae\") " pod="openstack/neutron-db-sync-f8w6m" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.362582 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb96e05e-80fb-4eec-b609-123ed43152ae-combined-ca-bundle\") pod \"neutron-db-sync-f8w6m\" (UID: \"fb96e05e-80fb-4eec-b609-123ed43152ae\") " pod="openstack/neutron-db-sync-f8w6m" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.362621 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.362657 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " 
pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.362686 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09308063-0c8c-4f0a-83f5-779364607b38-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.362708 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-config-data\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.362794 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxkjc\" (UniqueName: \"kubernetes.io/projected/fb96e05e-80fb-4eec-b609-123ed43152ae-kube-api-access-jxkjc\") pod \"neutron-db-sync-f8w6m\" (UID: \"fb96e05e-80fb-4eec-b609-123ed43152ae\") " pod="openstack/neutron-db-sync-f8w6m" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.364840 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09308063-0c8c-4f0a-83f5-779364607b38-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.365532 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Oct 
09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.365604 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09308063-0c8c-4f0a-83f5-779364607b38-logs\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.374475 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-scripts\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.375930 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.386254 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.402103 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-config-data\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.407604 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-8rshk\" (UniqueName: \"kubernetes.io/projected/09308063-0c8c-4f0a-83f5-779364607b38-kube-api-access-8rshk\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.427301 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.464016 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxkjc\" (UniqueName: \"kubernetes.io/projected/fb96e05e-80fb-4eec-b609-123ed43152ae-kube-api-access-jxkjc\") pod \"neutron-db-sync-f8w6m\" (UID: \"fb96e05e-80fb-4eec-b609-123ed43152ae\") " pod="openstack/neutron-db-sync-f8w6m" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.464148 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb96e05e-80fb-4eec-b609-123ed43152ae-config\") pod \"neutron-db-sync-f8w6m\" (UID: \"fb96e05e-80fb-4eec-b609-123ed43152ae\") " pod="openstack/neutron-db-sync-f8w6m" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.464172 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb96e05e-80fb-4eec-b609-123ed43152ae-combined-ca-bundle\") pod \"neutron-db-sync-f8w6m\" (UID: \"fb96e05e-80fb-4eec-b609-123ed43152ae\") " pod="openstack/neutron-db-sync-f8w6m" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.471430 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb96e05e-80fb-4eec-b609-123ed43152ae-config\") pod 
\"neutron-db-sync-f8w6m\" (UID: \"fb96e05e-80fb-4eec-b609-123ed43152ae\") " pod="openstack/neutron-db-sync-f8w6m" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.471822 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb96e05e-80fb-4eec-b609-123ed43152ae-combined-ca-bundle\") pod \"neutron-db-sync-f8w6m\" (UID: \"fb96e05e-80fb-4eec-b609-123ed43152ae\") " pod="openstack/neutron-db-sync-f8w6m" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.480986 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-dzzbk" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.490772 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxkjc\" (UniqueName: \"kubernetes.io/projected/fb96e05e-80fb-4eec-b609-123ed43152ae-kube-api-access-jxkjc\") pod \"neutron-db-sync-f8w6m\" (UID: \"fb96e05e-80fb-4eec-b609-123ed43152ae\") " pod="openstack/neutron-db-sync-f8w6m" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.513252 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-f8w6m" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.516506 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.518898 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.622782 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.622821 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.669410 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be99ba98-fb4b-4609-986e-3636a4a8f244-combined-ca-bundle\") pod \"be99ba98-fb4b-4609-986e-3636a4a8f244\" (UID: \"be99ba98-fb4b-4609-986e-3636a4a8f244\") " Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.669470 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be99ba98-fb4b-4609-986e-3636a4a8f244-scripts\") pod \"be99ba98-fb4b-4609-986e-3636a4a8f244\" (UID: \"be99ba98-fb4b-4609-986e-3636a4a8f244\") " Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.669491 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be99ba98-fb4b-4609-986e-3636a4a8f244-logs\") pod \"be99ba98-fb4b-4609-986e-3636a4a8f244\" (UID: \"be99ba98-fb4b-4609-986e-3636a4a8f244\") " Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.669583 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/be99ba98-fb4b-4609-986e-3636a4a8f244-config-data\") pod \"be99ba98-fb4b-4609-986e-3636a4a8f244\" (UID: \"be99ba98-fb4b-4609-986e-3636a4a8f244\") " Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.669620 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5rqm\" (UniqueName: \"kubernetes.io/projected/be99ba98-fb4b-4609-986e-3636a4a8f244-kube-api-access-m5rqm\") pod \"be99ba98-fb4b-4609-986e-3636a4a8f244\" (UID: \"be99ba98-fb4b-4609-986e-3636a4a8f244\") " Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.670549 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be99ba98-fb4b-4609-986e-3636a4a8f244-logs" (OuterVolumeSpecName: "logs") pod "be99ba98-fb4b-4609-986e-3636a4a8f244" (UID: "be99ba98-fb4b-4609-986e-3636a4a8f244"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.674390 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be99ba98-fb4b-4609-986e-3636a4a8f244-scripts" (OuterVolumeSpecName: "scripts") pod "be99ba98-fb4b-4609-986e-3636a4a8f244" (UID: "be99ba98-fb4b-4609-986e-3636a4a8f244"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.682913 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be99ba98-fb4b-4609-986e-3636a4a8f244-kube-api-access-m5rqm" (OuterVolumeSpecName: "kube-api-access-m5rqm") pod "be99ba98-fb4b-4609-986e-3636a4a8f244" (UID: "be99ba98-fb4b-4609-986e-3636a4a8f244"). InnerVolumeSpecName "kube-api-access-m5rqm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.704951 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be99ba98-fb4b-4609-986e-3636a4a8f244-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be99ba98-fb4b-4609-986e-3636a4a8f244" (UID: "be99ba98-fb4b-4609-986e-3636a4a8f244"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.707908 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.723819 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be99ba98-fb4b-4609-986e-3636a4a8f244-config-data" (OuterVolumeSpecName: "config-data") pod "be99ba98-fb4b-4609-986e-3636a4a8f244" (UID: "be99ba98-fb4b-4609-986e-3636a4a8f244"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.783929 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be99ba98-fb4b-4609-986e-3636a4a8f244-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.783955 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be99ba98-fb4b-4609-986e-3636a4a8f244-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.783964 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be99ba98-fb4b-4609-986e-3636a4a8f244-logs\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.783972 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be99ba98-fb4b-4609-986e-3636a4a8f244-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.783981 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5rqm\" (UniqueName: \"kubernetes.io/projected/be99ba98-fb4b-4609-986e-3636a4a8f244-kube-api-access-m5rqm\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.799940 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48d3bba1-3d4b-49eb-bd78-41e3e91267b5" path="/var/lib/kubelet/pods/48d3bba1-3d4b-49eb-bd78-41e3e91267b5/volumes" Oct 09 10:45:13 crc kubenswrapper[4740]: I1009 10:45:13.801256 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-f8w6m"] Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.023061 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ktvhb" 
event={"ID":"71a8fb50-724c-4b07-83e2-71d8ee90cb05","Type":"ContainerStarted","Data":"af9dc27255d65f1bf502f880090b93e80c3869a57ba2a8c0d714eb227ee24b90"} Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.031294 4740 generic.go:334] "Generic (PLEG): container finished" podID="4808b047-cb78-4910-8c22-65514e99c2cc" containerID="10035550e5c5e9edbc0f68f1e00132b0ba979f7e45b5a19fcc5c069c73e6b908" exitCode=0 Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.031600 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wkc7b" event={"ID":"4808b047-cb78-4910-8c22-65514e99c2cc","Type":"ContainerDied","Data":"10035550e5c5e9edbc0f68f1e00132b0ba979f7e45b5a19fcc5c069c73e6b908"} Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.040566 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-ktvhb" podStartSLOduration=2.276195092 podStartE2EDuration="37.040543431s" podCreationTimestamp="2025-10-09 10:44:37 +0000 UTC" firstStartedPulling="2025-10-09 10:44:38.94274706 +0000 UTC m=+1017.904947431" lastFinishedPulling="2025-10-09 10:45:13.707095399 +0000 UTC m=+1052.669295770" observedRunningTime="2025-10-09 10:45:14.039572345 +0000 UTC m=+1053.001772746" watchObservedRunningTime="2025-10-09 10:45:14.040543431 +0000 UTC m=+1053.002743812" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.048226 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dzzbk" event={"ID":"be99ba98-fb4b-4609-986e-3636a4a8f244","Type":"ContainerDied","Data":"351af6b7bfbd24c41f6ec34d48be2c286a2e9c30690dfdf9e9c334aeff63b093"} Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.048281 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="351af6b7bfbd24c41f6ec34d48be2c286a2e9c30690dfdf9e9c334aeff63b093" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.048406 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-dzzbk" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.061639 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-f8w6m" event={"ID":"fb96e05e-80fb-4eec-b609-123ed43152ae","Type":"ContainerStarted","Data":"a0e92c541c9f5d8bd40bdc61f0cf213dccde0034b31baed8a1e9cc213ebd1eaf"} Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.125699 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.171155 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-55c77867db-hsc8q"] Oct 09 10:45:14 crc kubenswrapper[4740]: E1009 10:45:14.171586 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be99ba98-fb4b-4609-986e-3636a4a8f244" containerName="placement-db-sync" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.171603 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="be99ba98-fb4b-4609-986e-3636a4a8f244" containerName="placement-db-sync" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.171798 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="be99ba98-fb4b-4609-986e-3636a4a8f244" containerName="placement-db-sync" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.172786 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.177227 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.177367 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.177739 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.177844 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-47wgr" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.177844 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.185933 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55c77867db-hsc8q"] Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.196970 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48968716-1198-429f-90f0-ab6663baaed5-internal-tls-certs\") pod \"placement-55c77867db-hsc8q\" (UID: \"48968716-1198-429f-90f0-ab6663baaed5\") " pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.197066 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48968716-1198-429f-90f0-ab6663baaed5-logs\") pod \"placement-55c77867db-hsc8q\" (UID: \"48968716-1198-429f-90f0-ab6663baaed5\") " pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.197110 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvdpk\" (UniqueName: \"kubernetes.io/projected/48968716-1198-429f-90f0-ab6663baaed5-kube-api-access-dvdpk\") pod \"placement-55c77867db-hsc8q\" (UID: \"48968716-1198-429f-90f0-ab6663baaed5\") " pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.197162 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48968716-1198-429f-90f0-ab6663baaed5-config-data\") pod \"placement-55c77867db-hsc8q\" (UID: \"48968716-1198-429f-90f0-ab6663baaed5\") " pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.197206 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48968716-1198-429f-90f0-ab6663baaed5-public-tls-certs\") pod \"placement-55c77867db-hsc8q\" (UID: \"48968716-1198-429f-90f0-ab6663baaed5\") " pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.197241 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48968716-1198-429f-90f0-ab6663baaed5-combined-ca-bundle\") pod \"placement-55c77867db-hsc8q\" (UID: \"48968716-1198-429f-90f0-ab6663baaed5\") " pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.197376 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48968716-1198-429f-90f0-ab6663baaed5-scripts\") pod \"placement-55c77867db-hsc8q\" (UID: \"48968716-1198-429f-90f0-ab6663baaed5\") " pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.298934 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48968716-1198-429f-90f0-ab6663baaed5-config-data\") pod \"placement-55c77867db-hsc8q\" (UID: \"48968716-1198-429f-90f0-ab6663baaed5\") " pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.299004 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48968716-1198-429f-90f0-ab6663baaed5-public-tls-certs\") pod \"placement-55c77867db-hsc8q\" (UID: \"48968716-1198-429f-90f0-ab6663baaed5\") " pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.299031 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48968716-1198-429f-90f0-ab6663baaed5-combined-ca-bundle\") pod \"placement-55c77867db-hsc8q\" (UID: \"48968716-1198-429f-90f0-ab6663baaed5\") " pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.299069 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48968716-1198-429f-90f0-ab6663baaed5-scripts\") pod \"placement-55c77867db-hsc8q\" (UID: \"48968716-1198-429f-90f0-ab6663baaed5\") " pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.299169 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48968716-1198-429f-90f0-ab6663baaed5-internal-tls-certs\") pod \"placement-55c77867db-hsc8q\" (UID: \"48968716-1198-429f-90f0-ab6663baaed5\") " pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.299225 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/48968716-1198-429f-90f0-ab6663baaed5-logs\") pod \"placement-55c77867db-hsc8q\" (UID: \"48968716-1198-429f-90f0-ab6663baaed5\") " pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.299258 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvdpk\" (UniqueName: \"kubernetes.io/projected/48968716-1198-429f-90f0-ab6663baaed5-kube-api-access-dvdpk\") pod \"placement-55c77867db-hsc8q\" (UID: \"48968716-1198-429f-90f0-ab6663baaed5\") " pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.299816 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48968716-1198-429f-90f0-ab6663baaed5-logs\") pod \"placement-55c77867db-hsc8q\" (UID: \"48968716-1198-429f-90f0-ab6663baaed5\") " pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.306238 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48968716-1198-429f-90f0-ab6663baaed5-public-tls-certs\") pod \"placement-55c77867db-hsc8q\" (UID: \"48968716-1198-429f-90f0-ab6663baaed5\") " pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.306417 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48968716-1198-429f-90f0-ab6663baaed5-internal-tls-certs\") pod \"placement-55c77867db-hsc8q\" (UID: \"48968716-1198-429f-90f0-ab6663baaed5\") " pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.307278 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48968716-1198-429f-90f0-ab6663baaed5-combined-ca-bundle\") pod 
\"placement-55c77867db-hsc8q\" (UID: \"48968716-1198-429f-90f0-ab6663baaed5\") " pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.307505 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48968716-1198-429f-90f0-ab6663baaed5-scripts\") pod \"placement-55c77867db-hsc8q\" (UID: \"48968716-1198-429f-90f0-ab6663baaed5\") " pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.311648 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48968716-1198-429f-90f0-ab6663baaed5-config-data\") pod \"placement-55c77867db-hsc8q\" (UID: \"48968716-1198-429f-90f0-ab6663baaed5\") " pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.318345 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvdpk\" (UniqueName: \"kubernetes.io/projected/48968716-1198-429f-90f0-ab6663baaed5-kube-api-access-dvdpk\") pod \"placement-55c77867db-hsc8q\" (UID: \"48968716-1198-429f-90f0-ab6663baaed5\") " pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:14 crc kubenswrapper[4740]: I1009 10:45:14.505295 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:18 crc kubenswrapper[4740]: W1009 10:45:18.743536 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09308063_0c8c_4f0a_83f5_779364607b38.slice/crio-2e5d27487f0f11d70d7c078105348de1f1755e5696b11f187ac77a8ceabc4ea7 WatchSource:0}: Error finding container 2e5d27487f0f11d70d7c078105348de1f1755e5696b11f187ac77a8ceabc4ea7: Status 404 returned error can't find the container with id 2e5d27487f0f11d70d7c078105348de1f1755e5696b11f187ac77a8ceabc4ea7 Oct 09 10:45:18 crc kubenswrapper[4740]: I1009 10:45:18.972555 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wkc7b" Oct 09 10:45:18 crc kubenswrapper[4740]: I1009 10:45:18.989796 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-config-data\") pod \"4808b047-cb78-4910-8c22-65514e99c2cc\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " Oct 09 10:45:18 crc kubenswrapper[4740]: I1009 10:45:18.989887 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-credential-keys\") pod \"4808b047-cb78-4910-8c22-65514e99c2cc\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " Oct 09 10:45:18 crc kubenswrapper[4740]: I1009 10:45:18.989923 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-combined-ca-bundle\") pod \"4808b047-cb78-4910-8c22-65514e99c2cc\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " Oct 09 10:45:18 crc kubenswrapper[4740]: I1009 10:45:18.989964 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-fernet-keys\") pod \"4808b047-cb78-4910-8c22-65514e99c2cc\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " Oct 09 10:45:18 crc kubenswrapper[4740]: I1009 10:45:18.990078 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-scripts\") pod \"4808b047-cb78-4910-8c22-65514e99c2cc\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " Oct 09 10:45:18 crc kubenswrapper[4740]: I1009 10:45:18.990101 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgg5d\" (UniqueName: \"kubernetes.io/projected/4808b047-cb78-4910-8c22-65514e99c2cc-kube-api-access-pgg5d\") pod \"4808b047-cb78-4910-8c22-65514e99c2cc\" (UID: \"4808b047-cb78-4910-8c22-65514e99c2cc\") " Oct 09 10:45:19 crc kubenswrapper[4740]: I1009 10:45:18.998436 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4808b047-cb78-4910-8c22-65514e99c2cc-kube-api-access-pgg5d" (OuterVolumeSpecName: "kube-api-access-pgg5d") pod "4808b047-cb78-4910-8c22-65514e99c2cc" (UID: "4808b047-cb78-4910-8c22-65514e99c2cc"). InnerVolumeSpecName "kube-api-access-pgg5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:45:19 crc kubenswrapper[4740]: I1009 10:45:18.998772 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-scripts" (OuterVolumeSpecName: "scripts") pod "4808b047-cb78-4910-8c22-65514e99c2cc" (UID: "4808b047-cb78-4910-8c22-65514e99c2cc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:19 crc kubenswrapper[4740]: I1009 10:45:19.003229 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4808b047-cb78-4910-8c22-65514e99c2cc" (UID: "4808b047-cb78-4910-8c22-65514e99c2cc"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:19 crc kubenswrapper[4740]: I1009 10:45:19.005155 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4808b047-cb78-4910-8c22-65514e99c2cc" (UID: "4808b047-cb78-4910-8c22-65514e99c2cc"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:19 crc kubenswrapper[4740]: I1009 10:45:19.052503 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-config-data" (OuterVolumeSpecName: "config-data") pod "4808b047-cb78-4910-8c22-65514e99c2cc" (UID: "4808b047-cb78-4910-8c22-65514e99c2cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:19 crc kubenswrapper[4740]: I1009 10:45:19.070428 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4808b047-cb78-4910-8c22-65514e99c2cc" (UID: "4808b047-cb78-4910-8c22-65514e99c2cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:19 crc kubenswrapper[4740]: I1009 10:45:19.092244 4740 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:19 crc kubenswrapper[4740]: I1009 10:45:19.092274 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:19 crc kubenswrapper[4740]: I1009 10:45:19.092283 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgg5d\" (UniqueName: \"kubernetes.io/projected/4808b047-cb78-4910-8c22-65514e99c2cc-kube-api-access-pgg5d\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:19 crc kubenswrapper[4740]: I1009 10:45:19.092294 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:19 crc kubenswrapper[4740]: I1009 10:45:19.092302 4740 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:19 crc kubenswrapper[4740]: I1009 10:45:19.092312 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4808b047-cb78-4910-8c22-65514e99c2cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:19 crc kubenswrapper[4740]: I1009 10:45:19.166053 4740 generic.go:334] "Generic (PLEG): container finished" podID="71a8fb50-724c-4b07-83e2-71d8ee90cb05" containerID="af9dc27255d65f1bf502f880090b93e80c3869a57ba2a8c0d714eb227ee24b90" exitCode=0 Oct 09 10:45:19 crc kubenswrapper[4740]: I1009 10:45:19.166422 4740 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/barbican-db-sync-ktvhb" event={"ID":"71a8fb50-724c-4b07-83e2-71d8ee90cb05","Type":"ContainerDied","Data":"af9dc27255d65f1bf502f880090b93e80c3869a57ba2a8c0d714eb227ee24b90"} Oct 09 10:45:19 crc kubenswrapper[4740]: I1009 10:45:19.171431 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wkc7b" event={"ID":"4808b047-cb78-4910-8c22-65514e99c2cc","Type":"ContainerDied","Data":"83a108c794650a9c33993790ba732e32133ed0be71b1d02c6beb98ad165e0ef4"} Oct 09 10:45:19 crc kubenswrapper[4740]: I1009 10:45:19.171473 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83a108c794650a9c33993790ba732e32133ed0be71b1d02c6beb98ad165e0ef4" Oct 09 10:45:19 crc kubenswrapper[4740]: I1009 10:45:19.171526 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wkc7b" Oct 09 10:45:19 crc kubenswrapper[4740]: I1009 10:45:19.175357 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09308063-0c8c-4f0a-83f5-779364607b38","Type":"ContainerStarted","Data":"2e5d27487f0f11d70d7c078105348de1f1755e5696b11f187ac77a8ceabc4ea7"} Oct 09 10:45:19 crc kubenswrapper[4740]: I1009 10:45:19.177068 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-f8w6m" event={"ID":"fb96e05e-80fb-4eec-b609-123ed43152ae","Type":"ContainerStarted","Data":"08a803f91f066820fd59c7e3e405a1c4934c77c751a2c0dc019586ab31cbf341"} Oct 09 10:45:19 crc kubenswrapper[4740]: I1009 10:45:19.351209 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-f8w6m" podStartSLOduration=6.351187842 podStartE2EDuration="6.351187842s" podCreationTimestamp="2025-10-09 10:45:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:45:19.206530154 
+0000 UTC m=+1058.168730535" watchObservedRunningTime="2025-10-09 10:45:19.351187842 +0000 UTC m=+1058.313388223" Oct 09 10:45:19 crc kubenswrapper[4740]: I1009 10:45:19.354674 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55c77867db-hsc8q"] Oct 09 10:45:19 crc kubenswrapper[4740]: W1009 10:45:19.358022 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48968716_1198_429f_90f0_ab6663baaed5.slice/crio-7ec81524a25fdf61b3ac3d3ae5a1fe929456ff3fa7c7f1dab70b54b15a96cc6b WatchSource:0}: Error finding container 7ec81524a25fdf61b3ac3d3ae5a1fe929456ff3fa7c7f1dab70b54b15a96cc6b: Status 404 returned error can't find the container with id 7ec81524a25fdf61b3ac3d3ae5a1fe929456ff3fa7c7f1dab70b54b15a96cc6b Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.029575 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.029903 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.067965 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5db569f5cf-ksc2p"] Oct 09 10:45:20 crc kubenswrapper[4740]: E1009 10:45:20.068359 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4808b047-cb78-4910-8c22-65514e99c2cc" containerName="keystone-bootstrap" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.068379 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="4808b047-cb78-4910-8c22-65514e99c2cc" containerName="keystone-bootstrap" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.068598 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="4808b047-cb78-4910-8c22-65514e99c2cc" containerName="keystone-bootstrap" Oct 09 10:45:20 crc kubenswrapper[4740]: 
I1009 10:45:20.069196 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.075258 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.075319 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.075474 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.075658 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.075871 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-52ljb" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.080037 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.094296 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.097062 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5db569f5cf-ksc2p"] Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.115684 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.196700 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09308063-0c8c-4f0a-83f5-779364607b38","Type":"ContainerStarted","Data":"d441530e363950c3e320e9b8c7d8da2221fd27b17faee5645ff5761c6c531dc7"} Oct 09 10:45:20 crc 
kubenswrapper[4740]: I1009 10:45:20.197075 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09308063-0c8c-4f0a-83f5-779364607b38","Type":"ContainerStarted","Data":"a0e94c39a36122188ca1d34ed558936f01edc7c33c4ad865ed2856dd362fb206"} Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.198700 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55c77867db-hsc8q" event={"ID":"48968716-1198-429f-90f0-ab6663baaed5","Type":"ContainerStarted","Data":"6f30765c9ef10e7c39b1d830c05d3c9d3da2aa39928fe5a49763341a6886be48"} Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.201336 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.201469 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.201527 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55c77867db-hsc8q" event={"ID":"48968716-1198-429f-90f0-ab6663baaed5","Type":"ContainerStarted","Data":"69d8d4dc48bb3ac56b3db5fb33c6831085d688a8c1fae2adc1ed4dd4742e4f98"} Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.201608 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55c77867db-hsc8q" event={"ID":"48968716-1198-429f-90f0-ab6663baaed5","Type":"ContainerStarted","Data":"7ec81524a25fdf61b3ac3d3ae5a1fe929456ff3fa7c7f1dab70b54b15a96cc6b"} Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.202004 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f91334a-239f-4459-b885-aa9865bc6a04","Type":"ContainerStarted","Data":"4cc409aa83e41e7b27b8e396f286646f4f957ab9d531b4d6914f61e55b438425"} Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.203398 4740 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.203473 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.212854 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2497d66-a643-4eb4-b69d-725db422cb3a-internal-tls-certs\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.212932 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7c5r\" (UniqueName: \"kubernetes.io/projected/d2497d66-a643-4eb4-b69d-725db422cb3a-kube-api-access-d7c5r\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.212959 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2497d66-a643-4eb4-b69d-725db422cb3a-fernet-keys\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.213003 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d2497d66-a643-4eb4-b69d-725db422cb3a-credential-keys\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.213021 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2497d66-a643-4eb4-b69d-725db422cb3a-scripts\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.213071 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2497d66-a643-4eb4-b69d-725db422cb3a-config-data\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.213098 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2497d66-a643-4eb4-b69d-725db422cb3a-combined-ca-bundle\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.213116 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2497d66-a643-4eb4-b69d-725db422cb3a-public-tls-certs\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.233103 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.23308451 podStartE2EDuration="7.23308451s" podCreationTimestamp="2025-10-09 10:45:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:45:20.232584056 +0000 UTC m=+1059.194784447" 
watchObservedRunningTime="2025-10-09 10:45:20.23308451 +0000 UTC m=+1059.195284891" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.256805 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-55c77867db-hsc8q" podStartSLOduration=6.256790732 podStartE2EDuration="6.256790732s" podCreationTimestamp="2025-10-09 10:45:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:45:20.256258068 +0000 UTC m=+1059.218458469" watchObservedRunningTime="2025-10-09 10:45:20.256790732 +0000 UTC m=+1059.218991113" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.314899 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d2497d66-a643-4eb4-b69d-725db422cb3a-credential-keys\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.315897 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2497d66-a643-4eb4-b69d-725db422cb3a-scripts\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.316091 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2497d66-a643-4eb4-b69d-725db422cb3a-config-data\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.316207 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d2497d66-a643-4eb4-b69d-725db422cb3a-combined-ca-bundle\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.316318 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2497d66-a643-4eb4-b69d-725db422cb3a-public-tls-certs\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.316571 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2497d66-a643-4eb4-b69d-725db422cb3a-internal-tls-certs\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.316698 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7c5r\" (UniqueName: \"kubernetes.io/projected/d2497d66-a643-4eb4-b69d-725db422cb3a-kube-api-access-d7c5r\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.316796 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2497d66-a643-4eb4-b69d-725db422cb3a-fernet-keys\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.322096 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2497d66-a643-4eb4-b69d-725db422cb3a-config-data\") pod 
\"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.322612 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2497d66-a643-4eb4-b69d-725db422cb3a-scripts\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.324200 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2497d66-a643-4eb4-b69d-725db422cb3a-public-tls-certs\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.324718 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d2497d66-a643-4eb4-b69d-725db422cb3a-credential-keys\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.329176 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2497d66-a643-4eb4-b69d-725db422cb3a-internal-tls-certs\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.329622 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2497d66-a643-4eb4-b69d-725db422cb3a-combined-ca-bundle\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 
10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.332069 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2497d66-a643-4eb4-b69d-725db422cb3a-fernet-keys\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.345259 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7c5r\" (UniqueName: \"kubernetes.io/projected/d2497d66-a643-4eb4-b69d-725db422cb3a-kube-api-access-d7c5r\") pod \"keystone-5db569f5cf-ksc2p\" (UID: \"d2497d66-a643-4eb4-b69d-725db422cb3a\") " pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.400892 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.851233 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-ktvhb" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.931562 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/71a8fb50-724c-4b07-83e2-71d8ee90cb05-db-sync-config-data\") pod \"71a8fb50-724c-4b07-83e2-71d8ee90cb05\" (UID: \"71a8fb50-724c-4b07-83e2-71d8ee90cb05\") " Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.931696 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a8fb50-724c-4b07-83e2-71d8ee90cb05-combined-ca-bundle\") pod \"71a8fb50-724c-4b07-83e2-71d8ee90cb05\" (UID: \"71a8fb50-724c-4b07-83e2-71d8ee90cb05\") " Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.931740 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl4mj\" (UniqueName: \"kubernetes.io/projected/71a8fb50-724c-4b07-83e2-71d8ee90cb05-kube-api-access-sl4mj\") pod \"71a8fb50-724c-4b07-83e2-71d8ee90cb05\" (UID: \"71a8fb50-724c-4b07-83e2-71d8ee90cb05\") " Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.936643 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a8fb50-724c-4b07-83e2-71d8ee90cb05-kube-api-access-sl4mj" (OuterVolumeSpecName: "kube-api-access-sl4mj") pod "71a8fb50-724c-4b07-83e2-71d8ee90cb05" (UID: "71a8fb50-724c-4b07-83e2-71d8ee90cb05"). InnerVolumeSpecName "kube-api-access-sl4mj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.940363 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a8fb50-724c-4b07-83e2-71d8ee90cb05-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "71a8fb50-724c-4b07-83e2-71d8ee90cb05" (UID: "71a8fb50-724c-4b07-83e2-71d8ee90cb05"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:20 crc kubenswrapper[4740]: I1009 10:45:20.975872 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a8fb50-724c-4b07-83e2-71d8ee90cb05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71a8fb50-724c-4b07-83e2-71d8ee90cb05" (UID: "71a8fb50-724c-4b07-83e2-71d8ee90cb05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.035667 4740 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/71a8fb50-724c-4b07-83e2-71d8ee90cb05-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.036039 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a8fb50-724c-4b07-83e2-71d8ee90cb05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.036053 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl4mj\" (UniqueName: \"kubernetes.io/projected/71a8fb50-724c-4b07-83e2-71d8ee90cb05-kube-api-access-sl4mj\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.173067 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5db569f5cf-ksc2p"] Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.219340 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ktvhb" event={"ID":"71a8fb50-724c-4b07-83e2-71d8ee90cb05","Type":"ContainerDied","Data":"0ace26903fb13a6d02f7c4d59441e3ad1ea0b954440ed07f16b30c0fa59f899f"} Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.219378 4740 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="0ace26903fb13a6d02f7c4d59441e3ad1ea0b954440ed07f16b30c0fa59f899f" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.219433 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ktvhb" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.225792 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5db569f5cf-ksc2p" event={"ID":"d2497d66-a643-4eb4-b69d-725db422cb3a","Type":"ContainerStarted","Data":"1eb2336b044b481c85e23d852651cc1a4065f735cd5a08d32fd118c93bd83b47"} Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.465353 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-59d79d879-w5c9m"] Oct 09 10:45:21 crc kubenswrapper[4740]: E1009 10:45:21.465954 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a8fb50-724c-4b07-83e2-71d8ee90cb05" containerName="barbican-db-sync" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.465965 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a8fb50-724c-4b07-83e2-71d8ee90cb05" containerName="barbican-db-sync" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.466147 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a8fb50-724c-4b07-83e2-71d8ee90cb05" containerName="barbican-db-sync" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.467090 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-59d79d879-w5c9m" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.480730 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.480972 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-69htd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.481132 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.501647 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-59d79d879-w5c9m"] Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.512967 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5bd7855d54-lgqzd"] Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.514939 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5bd7855d54-lgqzd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.519316 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.544289 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db36903-c2ef-429f-97dd-46e98c2a061b-config-data\") pod \"barbican-worker-59d79d879-w5c9m\" (UID: \"8db36903-c2ef-429f-97dd-46e98c2a061b\") " pod="openstack/barbican-worker-59d79d879-w5c9m" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.544335 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8db36903-c2ef-429f-97dd-46e98c2a061b-config-data-custom\") pod \"barbican-worker-59d79d879-w5c9m\" (UID: \"8db36903-c2ef-429f-97dd-46e98c2a061b\") " pod="openstack/barbican-worker-59d79d879-w5c9m" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.544360 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8db36903-c2ef-429f-97dd-46e98c2a061b-logs\") pod \"barbican-worker-59d79d879-w5c9m\" (UID: \"8db36903-c2ef-429f-97dd-46e98c2a061b\") " pod="openstack/barbican-worker-59d79d879-w5c9m" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.544463 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fkq9\" (UniqueName: \"kubernetes.io/projected/8db36903-c2ef-429f-97dd-46e98c2a061b-kube-api-access-2fkq9\") pod \"barbican-worker-59d79d879-w5c9m\" (UID: \"8db36903-c2ef-429f-97dd-46e98c2a061b\") " pod="openstack/barbican-worker-59d79d879-w5c9m" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.544524 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db36903-c2ef-429f-97dd-46e98c2a061b-combined-ca-bundle\") pod \"barbican-worker-59d79d879-w5c9m\" (UID: \"8db36903-c2ef-429f-97dd-46e98c2a061b\") " pod="openstack/barbican-worker-59d79d879-w5c9m" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.557739 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5bd7855d54-lgqzd"] Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.625960 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-cxphd"] Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.627599 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.648146 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc4hd\" (UniqueName: \"kubernetes.io/projected/72bcd07c-fbd9-44cb-8295-ba498f012009-kube-api-access-wc4hd\") pod \"barbican-keystone-listener-5bd7855d54-lgqzd\" (UID: \"72bcd07c-fbd9-44cb-8295-ba498f012009\") " pod="openstack/barbican-keystone-listener-5bd7855d54-lgqzd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.648203 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db36903-c2ef-429f-97dd-46e98c2a061b-combined-ca-bundle\") pod \"barbican-worker-59d79d879-w5c9m\" (UID: \"8db36903-c2ef-429f-97dd-46e98c2a061b\") " pod="openstack/barbican-worker-59d79d879-w5c9m" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.648246 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/72bcd07c-fbd9-44cb-8295-ba498f012009-config-data-custom\") pod \"barbican-keystone-listener-5bd7855d54-lgqzd\" (UID: \"72bcd07c-fbd9-44cb-8295-ba498f012009\") " pod="openstack/barbican-keystone-listener-5bd7855d54-lgqzd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.648270 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db36903-c2ef-429f-97dd-46e98c2a061b-config-data\") pod \"barbican-worker-59d79d879-w5c9m\" (UID: \"8db36903-c2ef-429f-97dd-46e98c2a061b\") " pod="openstack/barbican-worker-59d79d879-w5c9m" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.648293 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8db36903-c2ef-429f-97dd-46e98c2a061b-config-data-custom\") pod \"barbican-worker-59d79d879-w5c9m\" (UID: \"8db36903-c2ef-429f-97dd-46e98c2a061b\") " pod="openstack/barbican-worker-59d79d879-w5c9m" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.648306 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8db36903-c2ef-429f-97dd-46e98c2a061b-logs\") pod \"barbican-worker-59d79d879-w5c9m\" (UID: \"8db36903-c2ef-429f-97dd-46e98c2a061b\") " pod="openstack/barbican-worker-59d79d879-w5c9m" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.648329 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72bcd07c-fbd9-44cb-8295-ba498f012009-config-data\") pod \"barbican-keystone-listener-5bd7855d54-lgqzd\" (UID: \"72bcd07c-fbd9-44cb-8295-ba498f012009\") " pod="openstack/barbican-keystone-listener-5bd7855d54-lgqzd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.648351 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/72bcd07c-fbd9-44cb-8295-ba498f012009-logs\") pod \"barbican-keystone-listener-5bd7855d54-lgqzd\" (UID: \"72bcd07c-fbd9-44cb-8295-ba498f012009\") " pod="openstack/barbican-keystone-listener-5bd7855d54-lgqzd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.648412 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72bcd07c-fbd9-44cb-8295-ba498f012009-combined-ca-bundle\") pod \"barbican-keystone-listener-5bd7855d54-lgqzd\" (UID: \"72bcd07c-fbd9-44cb-8295-ba498f012009\") " pod="openstack/barbican-keystone-listener-5bd7855d54-lgqzd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.648438 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fkq9\" (UniqueName: \"kubernetes.io/projected/8db36903-c2ef-429f-97dd-46e98c2a061b-kube-api-access-2fkq9\") pod \"barbican-worker-59d79d879-w5c9m\" (UID: \"8db36903-c2ef-429f-97dd-46e98c2a061b\") " pod="openstack/barbican-worker-59d79d879-w5c9m" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.649059 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8db36903-c2ef-429f-97dd-46e98c2a061b-logs\") pod \"barbican-worker-59d79d879-w5c9m\" (UID: \"8db36903-c2ef-429f-97dd-46e98c2a061b\") " pod="openstack/barbican-worker-59d79d879-w5c9m" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.653476 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8db36903-c2ef-429f-97dd-46e98c2a061b-config-data-custom\") pod \"barbican-worker-59d79d879-w5c9m\" (UID: \"8db36903-c2ef-429f-97dd-46e98c2a061b\") " pod="openstack/barbican-worker-59d79d879-w5c9m" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.659051 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db36903-c2ef-429f-97dd-46e98c2a061b-config-data\") pod \"barbican-worker-59d79d879-w5c9m\" (UID: \"8db36903-c2ef-429f-97dd-46e98c2a061b\") " pod="openstack/barbican-worker-59d79d879-w5c9m" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.665021 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db36903-c2ef-429f-97dd-46e98c2a061b-combined-ca-bundle\") pod \"barbican-worker-59d79d879-w5c9m\" (UID: \"8db36903-c2ef-429f-97dd-46e98c2a061b\") " pod="openstack/barbican-worker-59d79d879-w5c9m" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.675807 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-cxphd"] Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.680402 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fkq9\" (UniqueName: \"kubernetes.io/projected/8db36903-c2ef-429f-97dd-46e98c2a061b-kube-api-access-2fkq9\") pod \"barbican-worker-59d79d879-w5c9m\" (UID: \"8db36903-c2ef-429f-97dd-46e98c2a061b\") " pod="openstack/barbican-worker-59d79d879-w5c9m" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.692886 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7d7f9c54-kwrl7"] Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.694331 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.702549 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.717499 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d7f9c54-kwrl7"] Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.749828 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72bcd07c-fbd9-44cb-8295-ba498f012009-combined-ca-bundle\") pod \"barbican-keystone-listener-5bd7855d54-lgqzd\" (UID: \"72bcd07c-fbd9-44cb-8295-ba498f012009\") " pod="openstack/barbican-keystone-listener-5bd7855d54-lgqzd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.749890 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-cxphd\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.749917 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc4hd\" (UniqueName: \"kubernetes.io/projected/72bcd07c-fbd9-44cb-8295-ba498f012009-kube-api-access-wc4hd\") pod \"barbican-keystone-listener-5bd7855d54-lgqzd\" (UID: \"72bcd07c-fbd9-44cb-8295-ba498f012009\") " pod="openstack/barbican-keystone-listener-5bd7855d54-lgqzd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.749949 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-config-data\") pod \"barbican-api-7d7f9c54-kwrl7\" (UID: 
\"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\") " pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.749984 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/72bcd07c-fbd9-44cb-8295-ba498f012009-config-data-custom\") pod \"barbican-keystone-listener-5bd7855d54-lgqzd\" (UID: \"72bcd07c-fbd9-44cb-8295-ba498f012009\") " pod="openstack/barbican-keystone-listener-5bd7855d54-lgqzd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.749999 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgvn6\" (UniqueName: \"kubernetes.io/projected/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-kube-api-access-jgvn6\") pod \"barbican-api-7d7f9c54-kwrl7\" (UID: \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\") " pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.750026 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdh4g\" (UniqueName: \"kubernetes.io/projected/ddf10f98-4990-4b57-b586-8c47fcf5993e-kube-api-access-tdh4g\") pod \"dnsmasq-dns-59d5ff467f-cxphd\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.750050 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72bcd07c-fbd9-44cb-8295-ba498f012009-config-data\") pod \"barbican-keystone-listener-5bd7855d54-lgqzd\" (UID: \"72bcd07c-fbd9-44cb-8295-ba498f012009\") " pod="openstack/barbican-keystone-listener-5bd7855d54-lgqzd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.750072 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-cxphd\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.750089 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72bcd07c-fbd9-44cb-8295-ba498f012009-logs\") pod \"barbican-keystone-listener-5bd7855d54-lgqzd\" (UID: \"72bcd07c-fbd9-44cb-8295-ba498f012009\") " pod="openstack/barbican-keystone-listener-5bd7855d54-lgqzd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.750108 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-combined-ca-bundle\") pod \"barbican-api-7d7f9c54-kwrl7\" (UID: \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\") " pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.750130 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-cxphd\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.750150 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-config-data-custom\") pod \"barbican-api-7d7f9c54-kwrl7\" (UID: \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\") " pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.750172 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-config\") pod \"dnsmasq-dns-59d5ff467f-cxphd\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.750200 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-logs\") pod \"barbican-api-7d7f9c54-kwrl7\" (UID: \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\") " pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.750217 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-cxphd\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.754264 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72bcd07c-fbd9-44cb-8295-ba498f012009-logs\") pod \"barbican-keystone-listener-5bd7855d54-lgqzd\" (UID: \"72bcd07c-fbd9-44cb-8295-ba498f012009\") " pod="openstack/barbican-keystone-listener-5bd7855d54-lgqzd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.755445 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72bcd07c-fbd9-44cb-8295-ba498f012009-combined-ca-bundle\") pod \"barbican-keystone-listener-5bd7855d54-lgqzd\" (UID: \"72bcd07c-fbd9-44cb-8295-ba498f012009\") " pod="openstack/barbican-keystone-listener-5bd7855d54-lgqzd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.757636 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/72bcd07c-fbd9-44cb-8295-ba498f012009-config-data-custom\") pod \"barbican-keystone-listener-5bd7855d54-lgqzd\" (UID: \"72bcd07c-fbd9-44cb-8295-ba498f012009\") " pod="openstack/barbican-keystone-listener-5bd7855d54-lgqzd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.757922 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72bcd07c-fbd9-44cb-8295-ba498f012009-config-data\") pod \"barbican-keystone-listener-5bd7855d54-lgqzd\" (UID: \"72bcd07c-fbd9-44cb-8295-ba498f012009\") " pod="openstack/barbican-keystone-listener-5bd7855d54-lgqzd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.782861 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc4hd\" (UniqueName: \"kubernetes.io/projected/72bcd07c-fbd9-44cb-8295-ba498f012009-kube-api-access-wc4hd\") pod \"barbican-keystone-listener-5bd7855d54-lgqzd\" (UID: \"72bcd07c-fbd9-44cb-8295-ba498f012009\") " pod="openstack/barbican-keystone-listener-5bd7855d54-lgqzd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.798049 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-59d79d879-w5c9m" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.851687 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-cxphd\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.851793 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-combined-ca-bundle\") pod \"barbican-api-7d7f9c54-kwrl7\" (UID: \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\") " pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.851837 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-cxphd\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.851859 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-config-data-custom\") pod \"barbican-api-7d7f9c54-kwrl7\" (UID: \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\") " pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.851883 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-config\") pod \"dnsmasq-dns-59d5ff467f-cxphd\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 
09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.851925 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-logs\") pod \"barbican-api-7d7f9c54-kwrl7\" (UID: \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\") " pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.851943 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-cxphd\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.851988 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-cxphd\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.852047 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-config-data\") pod \"barbican-api-7d7f9c54-kwrl7\" (UID: \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\") " pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.852120 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgvn6\" (UniqueName: \"kubernetes.io/projected/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-kube-api-access-jgvn6\") pod \"barbican-api-7d7f9c54-kwrl7\" (UID: \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\") " pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.852147 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdh4g\" (UniqueName: \"kubernetes.io/projected/ddf10f98-4990-4b57-b586-8c47fcf5993e-kube-api-access-tdh4g\") pod \"dnsmasq-dns-59d5ff467f-cxphd\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.853227 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-cxphd\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.855386 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-cxphd\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.859979 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-config-data\") pod \"barbican-api-7d7f9c54-kwrl7\" (UID: \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\") " pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.860192 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-combined-ca-bundle\") pod \"barbican-api-7d7f9c54-kwrl7\" (UID: \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\") " pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.865459 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5bd7855d54-lgqzd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.866863 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-config\") pod \"dnsmasq-dns-59d5ff467f-cxphd\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.867063 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-logs\") pod \"barbican-api-7d7f9c54-kwrl7\" (UID: \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\") " pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.868476 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-cxphd\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.869859 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-cxphd\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.870181 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdh4g\" (UniqueName: \"kubernetes.io/projected/ddf10f98-4990-4b57-b586-8c47fcf5993e-kube-api-access-tdh4g\") pod \"dnsmasq-dns-59d5ff467f-cxphd\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:21 crc 
kubenswrapper[4740]: I1009 10:45:21.871588 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-config-data-custom\") pod \"barbican-api-7d7f9c54-kwrl7\" (UID: \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\") " pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:21 crc kubenswrapper[4740]: I1009 10:45:21.881357 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgvn6\" (UniqueName: \"kubernetes.io/projected/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-kube-api-access-jgvn6\") pod \"barbican-api-7d7f9c54-kwrl7\" (UID: \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\") " pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:22 crc kubenswrapper[4740]: I1009 10:45:21.992908 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:22 crc kubenswrapper[4740]: I1009 10:45:22.017720 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:22 crc kubenswrapper[4740]: I1009 10:45:22.261298 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5db569f5cf-ksc2p" event={"ID":"d2497d66-a643-4eb4-b69d-725db422cb3a","Type":"ContainerStarted","Data":"c4966e1107f8e5d4323bdd60d36d9f24db642adb902a659228dd60e8052641c9"} Oct 09 10:45:22 crc kubenswrapper[4740]: I1009 10:45:22.261883 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:22 crc kubenswrapper[4740]: I1009 10:45:22.316452 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5db569f5cf-ksc2p" podStartSLOduration=2.316435103 podStartE2EDuration="2.316435103s" podCreationTimestamp="2025-10-09 10:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:45:22.291663472 +0000 UTC m=+1061.253863853" watchObservedRunningTime="2025-10-09 10:45:22.316435103 +0000 UTC m=+1061.278635484" Oct 09 10:45:22 crc kubenswrapper[4740]: I1009 10:45:22.398422 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-59d79d879-w5c9m"] Oct 09 10:45:22 crc kubenswrapper[4740]: I1009 10:45:22.559539 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5bd7855d54-lgqzd"] Oct 09 10:45:22 crc kubenswrapper[4740]: I1009 10:45:22.624447 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-cxphd"] Oct 09 10:45:22 crc kubenswrapper[4740]: I1009 10:45:22.666807 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d7f9c54-kwrl7"] Oct 09 10:45:23 crc kubenswrapper[4740]: I1009 10:45:23.248367 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 09 10:45:23 crc 
kubenswrapper[4740]: I1009 10:45:23.248680 4740 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 10:45:23 crc kubenswrapper[4740]: I1009 10:45:23.260936 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 09 10:45:23 crc kubenswrapper[4740]: I1009 10:45:23.278171 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mw6z4" event={"ID":"3062e734-0f07-4e8f-862e-a2906e7bbbd5","Type":"ContainerStarted","Data":"5b75be64d23dd84ced609d4b115b958cb2dfaf49a99ac0c3602d3d7777d78eb4"} Oct 09 10:45:23 crc kubenswrapper[4740]: I1009 10:45:23.280100 4740 generic.go:334] "Generic (PLEG): container finished" podID="ddf10f98-4990-4b57-b586-8c47fcf5993e" containerID="006f82e0c33141764dac4804b55006cd6eb9b7659dadac797f5366984ea94a3d" exitCode=0 Oct 09 10:45:23 crc kubenswrapper[4740]: I1009 10:45:23.280169 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" event={"ID":"ddf10f98-4990-4b57-b586-8c47fcf5993e","Type":"ContainerDied","Data":"006f82e0c33141764dac4804b55006cd6eb9b7659dadac797f5366984ea94a3d"} Oct 09 10:45:23 crc kubenswrapper[4740]: I1009 10:45:23.280194 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" event={"ID":"ddf10f98-4990-4b57-b586-8c47fcf5993e","Type":"ContainerStarted","Data":"a8ba68c58dc3c05c9f5f2fef26fa36041eebb3a99df864396afb493b58878ff8"} Oct 09 10:45:23 crc kubenswrapper[4740]: I1009 10:45:23.281988 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5bd7855d54-lgqzd" event={"ID":"72bcd07c-fbd9-44cb-8295-ba498f012009","Type":"ContainerStarted","Data":"414287cd891ac27082bddd2472c79f80fe4618d9f2feab1f4c8a9948411749b3"} Oct 09 10:45:23 crc kubenswrapper[4740]: I1009 10:45:23.286368 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-59d79d879-w5c9m" 
event={"ID":"8db36903-c2ef-429f-97dd-46e98c2a061b","Type":"ContainerStarted","Data":"f59c4a0f897f9f9e607273cd269651b06d935cc04371e1beabec78f3d2a19579"} Oct 09 10:45:23 crc kubenswrapper[4740]: I1009 10:45:23.314221 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d7f9c54-kwrl7" event={"ID":"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc","Type":"ContainerStarted","Data":"69c62d42a35a352f169fbf0778bd61ee3bbb732992914b48d25dd7792b562edd"} Oct 09 10:45:23 crc kubenswrapper[4740]: I1009 10:45:23.314264 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d7f9c54-kwrl7" event={"ID":"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc","Type":"ContainerStarted","Data":"a98c21eed4d0e1bdbab9c4f4a2d8a3a69aaf2528ff6c4a613504a7a0684f43f2"} Oct 09 10:45:23 crc kubenswrapper[4740]: I1009 10:45:23.314279 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d7f9c54-kwrl7" event={"ID":"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc","Type":"ContainerStarted","Data":"e02cfe8410a4c220df7037af22d112534209ac23e44c34d130ee98f453660ada"} Oct 09 10:45:23 crc kubenswrapper[4740]: I1009 10:45:23.314295 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:23 crc kubenswrapper[4740]: I1009 10:45:23.314309 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:23 crc kubenswrapper[4740]: I1009 10:45:23.333680 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-mw6z4" podStartSLOduration=3.959884276 podStartE2EDuration="46.333658187s" podCreationTimestamp="2025-10-09 10:44:37 +0000 UTC" firstStartedPulling="2025-10-09 10:44:39.029586819 +0000 UTC m=+1017.991787200" lastFinishedPulling="2025-10-09 10:45:21.40336073 +0000 UTC m=+1060.365561111" observedRunningTime="2025-10-09 10:45:23.326991856 +0000 UTC m=+1062.289192237" 
watchObservedRunningTime="2025-10-09 10:45:23.333658187 +0000 UTC m=+1062.295858568" Oct 09 10:45:23 crc kubenswrapper[4740]: I1009 10:45:23.375311 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7d7f9c54-kwrl7" podStartSLOduration=2.375286414 podStartE2EDuration="2.375286414s" podCreationTimestamp="2025-10-09 10:45:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:45:23.3651746 +0000 UTC m=+1062.327374981" watchObservedRunningTime="2025-10-09 10:45:23.375286414 +0000 UTC m=+1062.337486805" Oct 09 10:45:23 crc kubenswrapper[4740]: I1009 10:45:23.520051 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6f67cbf644-2n99k" podUID="5d46647a-6230-4561-bd21-a433ed55dad2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Oct 09 10:45:23 crc kubenswrapper[4740]: I1009 10:45:23.629887 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5dd4b95776-lcxbt" podUID="3762ae93-7451-4d99-aad4-f9c68666cf40" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Oct 09 10:45:23 crc kubenswrapper[4740]: I1009 10:45:23.708819 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 09 10:45:23 crc kubenswrapper[4740]: I1009 10:45:23.708873 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 09 10:45:23 crc kubenswrapper[4740]: I1009 10:45:23.786466 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 09 
10:45:23 crc kubenswrapper[4740]: I1009 10:45:23.791941 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.331936 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" event={"ID":"ddf10f98-4990-4b57-b586-8c47fcf5993e","Type":"ContainerStarted","Data":"94da4d88ef1c2e3c92a065fcc00c74247fe46254025bff6b7ba065c480db2b77"} Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.333345 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.334233 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.366267 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" podStartSLOduration=3.366249446 podStartE2EDuration="3.366249446s" podCreationTimestamp="2025-10-09 10:45:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:45:24.361952269 +0000 UTC m=+1063.324152650" watchObservedRunningTime="2025-10-09 10:45:24.366249446 +0000 UTC m=+1063.328449827" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.433035 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-544757df48-b9dz7"] Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.434458 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.442829 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-544757df48-b9dz7"] Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.444947 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.445002 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.551608 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc9b7872-3887-45a9-8405-506862479e3f-public-tls-certs\") pod \"barbican-api-544757df48-b9dz7\" (UID: \"dc9b7872-3887-45a9-8405-506862479e3f\") " pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.551664 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc9b7872-3887-45a9-8405-506862479e3f-config-data-custom\") pod \"barbican-api-544757df48-b9dz7\" (UID: \"dc9b7872-3887-45a9-8405-506862479e3f\") " pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.551689 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9b7872-3887-45a9-8405-506862479e3f-combined-ca-bundle\") pod \"barbican-api-544757df48-b9dz7\" (UID: \"dc9b7872-3887-45a9-8405-506862479e3f\") " pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.551715 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/dc9b7872-3887-45a9-8405-506862479e3f-config-data\") pod \"barbican-api-544757df48-b9dz7\" (UID: \"dc9b7872-3887-45a9-8405-506862479e3f\") " pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.551989 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc9b7872-3887-45a9-8405-506862479e3f-logs\") pod \"barbican-api-544757df48-b9dz7\" (UID: \"dc9b7872-3887-45a9-8405-506862479e3f\") " pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.552025 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qrtk\" (UniqueName: \"kubernetes.io/projected/dc9b7872-3887-45a9-8405-506862479e3f-kube-api-access-6qrtk\") pod \"barbican-api-544757df48-b9dz7\" (UID: \"dc9b7872-3887-45a9-8405-506862479e3f\") " pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.552099 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc9b7872-3887-45a9-8405-506862479e3f-internal-tls-certs\") pod \"barbican-api-544757df48-b9dz7\" (UID: \"dc9b7872-3887-45a9-8405-506862479e3f\") " pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.653914 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc9b7872-3887-45a9-8405-506862479e3f-logs\") pod \"barbican-api-544757df48-b9dz7\" (UID: \"dc9b7872-3887-45a9-8405-506862479e3f\") " pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.653969 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qrtk\" 
(UniqueName: \"kubernetes.io/projected/dc9b7872-3887-45a9-8405-506862479e3f-kube-api-access-6qrtk\") pod \"barbican-api-544757df48-b9dz7\" (UID: \"dc9b7872-3887-45a9-8405-506862479e3f\") " pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.654015 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc9b7872-3887-45a9-8405-506862479e3f-internal-tls-certs\") pod \"barbican-api-544757df48-b9dz7\" (UID: \"dc9b7872-3887-45a9-8405-506862479e3f\") " pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.654104 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc9b7872-3887-45a9-8405-506862479e3f-public-tls-certs\") pod \"barbican-api-544757df48-b9dz7\" (UID: \"dc9b7872-3887-45a9-8405-506862479e3f\") " pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.654127 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc9b7872-3887-45a9-8405-506862479e3f-config-data-custom\") pod \"barbican-api-544757df48-b9dz7\" (UID: \"dc9b7872-3887-45a9-8405-506862479e3f\") " pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.654156 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9b7872-3887-45a9-8405-506862479e3f-combined-ca-bundle\") pod \"barbican-api-544757df48-b9dz7\" (UID: \"dc9b7872-3887-45a9-8405-506862479e3f\") " pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.654179 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dc9b7872-3887-45a9-8405-506862479e3f-config-data\") pod \"barbican-api-544757df48-b9dz7\" (UID: \"dc9b7872-3887-45a9-8405-506862479e3f\") " pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.655615 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc9b7872-3887-45a9-8405-506862479e3f-logs\") pod \"barbican-api-544757df48-b9dz7\" (UID: \"dc9b7872-3887-45a9-8405-506862479e3f\") " pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.659663 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9b7872-3887-45a9-8405-506862479e3f-combined-ca-bundle\") pod \"barbican-api-544757df48-b9dz7\" (UID: \"dc9b7872-3887-45a9-8405-506862479e3f\") " pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.660533 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc9b7872-3887-45a9-8405-506862479e3f-internal-tls-certs\") pod \"barbican-api-544757df48-b9dz7\" (UID: \"dc9b7872-3887-45a9-8405-506862479e3f\") " pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.666633 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc9b7872-3887-45a9-8405-506862479e3f-config-data-custom\") pod \"barbican-api-544757df48-b9dz7\" (UID: \"dc9b7872-3887-45a9-8405-506862479e3f\") " pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.669183 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc9b7872-3887-45a9-8405-506862479e3f-public-tls-certs\") pod 
\"barbican-api-544757df48-b9dz7\" (UID: \"dc9b7872-3887-45a9-8405-506862479e3f\") " pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.674274 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qrtk\" (UniqueName: \"kubernetes.io/projected/dc9b7872-3887-45a9-8405-506862479e3f-kube-api-access-6qrtk\") pod \"barbican-api-544757df48-b9dz7\" (UID: \"dc9b7872-3887-45a9-8405-506862479e3f\") " pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.675775 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9b7872-3887-45a9-8405-506862479e3f-config-data\") pod \"barbican-api-544757df48-b9dz7\" (UID: \"dc9b7872-3887-45a9-8405-506862479e3f\") " pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:24 crc kubenswrapper[4740]: I1009 10:45:24.764353 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:25 crc kubenswrapper[4740]: I1009 10:45:25.339399 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:26 crc kubenswrapper[4740]: I1009 10:45:26.348449 4740 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 10:45:26 crc kubenswrapper[4740]: I1009 10:45:26.518371 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 09 10:45:27 crc kubenswrapper[4740]: I1009 10:45:27.070366 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 09 10:45:29 crc kubenswrapper[4740]: I1009 10:45:29.405397 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-544757df48-b9dz7"] Oct 09 10:45:29 crc kubenswrapper[4740]: W1009 10:45:29.593132 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc9b7872_3887_45a9_8405_506862479e3f.slice/crio-f0f0ba41f17432a14bdcb9589ef0d3c47ef720aa3ce8e8d825be21d1de5fa877 WatchSource:0}: Error finding container f0f0ba41f17432a14bdcb9589ef0d3c47ef720aa3ce8e8d825be21d1de5fa877: Status 404 returned error can't find the container with id f0f0ba41f17432a14bdcb9589ef0d3c47ef720aa3ce8e8d825be21d1de5fa877 Oct 09 10:45:30 crc kubenswrapper[4740]: I1009 10:45:30.396578 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f91334a-239f-4459-b885-aa9865bc6a04","Type":"ContainerStarted","Data":"5e8ab2e51a7f22cab795ad0692ad557452c74d9ab2578ecabd20e6c21b8662d6"} Oct 09 10:45:30 crc kubenswrapper[4740]: I1009 10:45:30.397026 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 10:45:30 crc kubenswrapper[4740]: I1009 10:45:30.396777 4740 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1f91334a-239f-4459-b885-aa9865bc6a04" containerName="sg-core" containerID="cri-o://4cc409aa83e41e7b27b8e396f286646f4f957ab9d531b4d6914f61e55b438425" gracePeriod=30 Oct 09 10:45:30 crc kubenswrapper[4740]: I1009 10:45:30.396698 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1f91334a-239f-4459-b885-aa9865bc6a04" containerName="ceilometer-central-agent" containerID="cri-o://c26954608c565ffe58564c831888fc6b660bc839108a510935b7f6da3bdfe85d" gracePeriod=30 Oct 09 10:45:30 crc kubenswrapper[4740]: I1009 10:45:30.396820 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1f91334a-239f-4459-b885-aa9865bc6a04" containerName="proxy-httpd" containerID="cri-o://5e8ab2e51a7f22cab795ad0692ad557452c74d9ab2578ecabd20e6c21b8662d6" gracePeriod=30 Oct 09 10:45:30 crc kubenswrapper[4740]: I1009 10:45:30.396821 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1f91334a-239f-4459-b885-aa9865bc6a04" containerName="ceilometer-notification-agent" containerID="cri-o://d4b67a6cd151baa83233a1bd259e38cc920f1cd7d6b272190590a2e057622e13" gracePeriod=30 Oct 09 10:45:30 crc kubenswrapper[4740]: I1009 10:45:30.399069 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5bd7855d54-lgqzd" event={"ID":"72bcd07c-fbd9-44cb-8295-ba498f012009","Type":"ContainerStarted","Data":"50489dc8a698c2c85de82583064494559bd9018766073e19e6d07b4f6d094291"} Oct 09 10:45:30 crc kubenswrapper[4740]: I1009 10:45:30.399115 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5bd7855d54-lgqzd" event={"ID":"72bcd07c-fbd9-44cb-8295-ba498f012009","Type":"ContainerStarted","Data":"c88568ca45337eb0f79d774a0dd7b4fbce1fee594bf1a98098034f9a31b06d0f"} Oct 09 
10:45:30 crc kubenswrapper[4740]: I1009 10:45:30.416013 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-59d79d879-w5c9m" event={"ID":"8db36903-c2ef-429f-97dd-46e98c2a061b","Type":"ContainerStarted","Data":"a1bb586468a501b73ecc06cff72f8ce16e9ce7618ddc48c2417d071f4e147be0"} Oct 09 10:45:30 crc kubenswrapper[4740]: I1009 10:45:30.416082 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-59d79d879-w5c9m" event={"ID":"8db36903-c2ef-429f-97dd-46e98c2a061b","Type":"ContainerStarted","Data":"a26373c3096999753f70889dbbbcdf3bb820dfd89cb5a5eeaa0c5cba3f742005"} Oct 09 10:45:30 crc kubenswrapper[4740]: I1009 10:45:30.425051 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.099936139 podStartE2EDuration="56.425036032s" podCreationTimestamp="2025-10-09 10:44:34 +0000 UTC" firstStartedPulling="2025-10-09 10:44:36.427437623 +0000 UTC m=+1015.389637994" lastFinishedPulling="2025-10-09 10:45:29.752537506 +0000 UTC m=+1068.714737887" observedRunningTime="2025-10-09 10:45:30.417624011 +0000 UTC m=+1069.379824422" watchObservedRunningTime="2025-10-09 10:45:30.425036032 +0000 UTC m=+1069.387236413" Oct 09 10:45:30 crc kubenswrapper[4740]: I1009 10:45:30.429470 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-544757df48-b9dz7" event={"ID":"dc9b7872-3887-45a9-8405-506862479e3f","Type":"ContainerStarted","Data":"e2eb833e7e32cde063a2ccfe5a0e4ef5ced3dcf061825956fd2a133f523f8fba"} Oct 09 10:45:30 crc kubenswrapper[4740]: I1009 10:45:30.429538 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-544757df48-b9dz7" event={"ID":"dc9b7872-3887-45a9-8405-506862479e3f","Type":"ContainerStarted","Data":"c25021347a1fbefad08f65ba37afd723b8f012a0183381f5b922cc679267faef"} Oct 09 10:45:30 crc kubenswrapper[4740]: I1009 10:45:30.429559 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-544757df48-b9dz7" event={"ID":"dc9b7872-3887-45a9-8405-506862479e3f","Type":"ContainerStarted","Data":"f0f0ba41f17432a14bdcb9589ef0d3c47ef720aa3ce8e8d825be21d1de5fa877"} Oct 09 10:45:30 crc kubenswrapper[4740]: I1009 10:45:30.430790 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:30 crc kubenswrapper[4740]: I1009 10:45:30.430844 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:30 crc kubenswrapper[4740]: I1009 10:45:30.449229 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5bd7855d54-lgqzd" podStartSLOduration=2.420519709 podStartE2EDuration="9.449206817s" podCreationTimestamp="2025-10-09 10:45:21 +0000 UTC" firstStartedPulling="2025-10-09 10:45:22.570188796 +0000 UTC m=+1061.532389167" lastFinishedPulling="2025-10-09 10:45:29.598875894 +0000 UTC m=+1068.561076275" observedRunningTime="2025-10-09 10:45:30.43455771 +0000 UTC m=+1069.396758111" watchObservedRunningTime="2025-10-09 10:45:30.449206817 +0000 UTC m=+1069.411407198" Oct 09 10:45:30 crc kubenswrapper[4740]: I1009 10:45:30.460855 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-59d79d879-w5c9m" podStartSLOduration=2.32749898 podStartE2EDuration="9.460834592s" podCreationTimestamp="2025-10-09 10:45:21 +0000 UTC" firstStartedPulling="2025-10-09 10:45:22.458569593 +0000 UTC m=+1061.420769984" lastFinishedPulling="2025-10-09 10:45:29.591905215 +0000 UTC m=+1068.554105596" observedRunningTime="2025-10-09 10:45:30.45521699 +0000 UTC m=+1069.417417371" watchObservedRunningTime="2025-10-09 10:45:30.460834592 +0000 UTC m=+1069.423034973" Oct 09 10:45:30 crc kubenswrapper[4740]: I1009 10:45:30.527745 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-api-544757df48-b9dz7" podStartSLOduration=6.5277241440000005 podStartE2EDuration="6.527724144s" podCreationTimestamp="2025-10-09 10:45:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:45:30.479007754 +0000 UTC m=+1069.441208135" watchObservedRunningTime="2025-10-09 10:45:30.527724144 +0000 UTC m=+1069.489924525" Oct 09 10:45:31 crc kubenswrapper[4740]: I1009 10:45:31.439506 4740 generic.go:334] "Generic (PLEG): container finished" podID="1f91334a-239f-4459-b885-aa9865bc6a04" containerID="5e8ab2e51a7f22cab795ad0692ad557452c74d9ab2578ecabd20e6c21b8662d6" exitCode=0 Oct 09 10:45:31 crc kubenswrapper[4740]: I1009 10:45:31.439830 4740 generic.go:334] "Generic (PLEG): container finished" podID="1f91334a-239f-4459-b885-aa9865bc6a04" containerID="4cc409aa83e41e7b27b8e396f286646f4f957ab9d531b4d6914f61e55b438425" exitCode=2 Oct 09 10:45:31 crc kubenswrapper[4740]: I1009 10:45:31.439590 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f91334a-239f-4459-b885-aa9865bc6a04","Type":"ContainerDied","Data":"5e8ab2e51a7f22cab795ad0692ad557452c74d9ab2578ecabd20e6c21b8662d6"} Oct 09 10:45:31 crc kubenswrapper[4740]: I1009 10:45:31.439874 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f91334a-239f-4459-b885-aa9865bc6a04","Type":"ContainerDied","Data":"4cc409aa83e41e7b27b8e396f286646f4f957ab9d531b4d6914f61e55b438425"} Oct 09 10:45:31 crc kubenswrapper[4740]: I1009 10:45:31.439887 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f91334a-239f-4459-b885-aa9865bc6a04","Type":"ContainerDied","Data":"c26954608c565ffe58564c831888fc6b660bc839108a510935b7f6da3bdfe85d"} Oct 09 10:45:31 crc kubenswrapper[4740]: I1009 10:45:31.439842 4740 generic.go:334] "Generic (PLEG): container finished" 
podID="1f91334a-239f-4459-b885-aa9865bc6a04" containerID="c26954608c565ffe58564c831888fc6b660bc839108a510935b7f6da3bdfe85d" exitCode=0 Oct 09 10:45:31 crc kubenswrapper[4740]: I1009 10:45:31.441960 4740 generic.go:334] "Generic (PLEG): container finished" podID="3062e734-0f07-4e8f-862e-a2906e7bbbd5" containerID="5b75be64d23dd84ced609d4b115b958cb2dfaf49a99ac0c3602d3d7777d78eb4" exitCode=0 Oct 09 10:45:31 crc kubenswrapper[4740]: I1009 10:45:31.442178 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mw6z4" event={"ID":"3062e734-0f07-4e8f-862e-a2906e7bbbd5","Type":"ContainerDied","Data":"5b75be64d23dd84ced609d4b115b958cb2dfaf49a99ac0c3602d3d7777d78eb4"} Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.019878 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.073283 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-tr9f9"] Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.073513 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" podUID="3e84b517-ac89-462b-baae-559a25766f7e" containerName="dnsmasq-dns" containerID="cri-o://6fa0c77057ccff8c499f02dc8114209b7aa046ee19a37dfaf9c5852a6a693f4a" gracePeriod=10 Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.469905 4740 generic.go:334] "Generic (PLEG): container finished" podID="3e84b517-ac89-462b-baae-559a25766f7e" containerID="6fa0c77057ccff8c499f02dc8114209b7aa046ee19a37dfaf9c5852a6a693f4a" exitCode=0 Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.473958 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" event={"ID":"3e84b517-ac89-462b-baae-559a25766f7e","Type":"ContainerDied","Data":"6fa0c77057ccff8c499f02dc8114209b7aa046ee19a37dfaf9c5852a6a693f4a"} Oct 09 10:45:32 crc 
kubenswrapper[4740]: I1009 10:45:32.575597 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.618380 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pgrl\" (UniqueName: \"kubernetes.io/projected/3e84b517-ac89-462b-baae-559a25766f7e-kube-api-access-8pgrl\") pod \"3e84b517-ac89-462b-baae-559a25766f7e\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.618427 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-config\") pod \"3e84b517-ac89-462b-baae-559a25766f7e\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.618520 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-dns-swift-storage-0\") pod \"3e84b517-ac89-462b-baae-559a25766f7e\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.618561 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-ovsdbserver-nb\") pod \"3e84b517-ac89-462b-baae-559a25766f7e\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.618653 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-dns-svc\") pod \"3e84b517-ac89-462b-baae-559a25766f7e\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.618671 
4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-ovsdbserver-sb\") pod \"3e84b517-ac89-462b-baae-559a25766f7e\" (UID: \"3e84b517-ac89-462b-baae-559a25766f7e\") " Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.648932 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e84b517-ac89-462b-baae-559a25766f7e-kube-api-access-8pgrl" (OuterVolumeSpecName: "kube-api-access-8pgrl") pod "3e84b517-ac89-462b-baae-559a25766f7e" (UID: "3e84b517-ac89-462b-baae-559a25766f7e"). InnerVolumeSpecName "kube-api-access-8pgrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.694479 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3e84b517-ac89-462b-baae-559a25766f7e" (UID: "3e84b517-ac89-462b-baae-559a25766f7e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.724841 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.725090 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pgrl\" (UniqueName: \"kubernetes.io/projected/3e84b517-ac89-462b-baae-559a25766f7e-kube-api-access-8pgrl\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.790420 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3e84b517-ac89-462b-baae-559a25766f7e" (UID: "3e84b517-ac89-462b-baae-559a25766f7e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.790743 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3e84b517-ac89-462b-baae-559a25766f7e" (UID: "3e84b517-ac89-462b-baae-559a25766f7e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.827002 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.827562 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.828300 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-config" (OuterVolumeSpecName: "config") pod "3e84b517-ac89-462b-baae-559a25766f7e" (UID: "3e84b517-ac89-462b-baae-559a25766f7e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.850320 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3e84b517-ac89-462b-baae-559a25766f7e" (UID: "3e84b517-ac89-462b-baae-559a25766f7e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.880681 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mw6z4" Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.928210 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5mb7\" (UniqueName: \"kubernetes.io/projected/3062e734-0f07-4e8f-862e-a2906e7bbbd5-kube-api-access-h5mb7\") pod \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.928313 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-combined-ca-bundle\") pod \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.928372 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3062e734-0f07-4e8f-862e-a2906e7bbbd5-etc-machine-id\") pod \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.928467 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-scripts\") pod \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.928516 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-db-sync-config-data\") pod \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.928576 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-config-data\") pod \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\" (UID: \"3062e734-0f07-4e8f-862e-a2906e7bbbd5\") " Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.928862 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3062e734-0f07-4e8f-862e-a2906e7bbbd5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3062e734-0f07-4e8f-862e-a2906e7bbbd5" (UID: "3062e734-0f07-4e8f-862e-a2906e7bbbd5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.929059 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.929075 4740 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e84b517-ac89-462b-baae-559a25766f7e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.929086 4740 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3062e734-0f07-4e8f-862e-a2906e7bbbd5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.932211 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3062e734-0f07-4e8f-862e-a2906e7bbbd5" (UID: "3062e734-0f07-4e8f-862e-a2906e7bbbd5"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.932225 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3062e734-0f07-4e8f-862e-a2906e7bbbd5-kube-api-access-h5mb7" (OuterVolumeSpecName: "kube-api-access-h5mb7") pod "3062e734-0f07-4e8f-862e-a2906e7bbbd5" (UID: "3062e734-0f07-4e8f-862e-a2906e7bbbd5"). InnerVolumeSpecName "kube-api-access-h5mb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.932811 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-scripts" (OuterVolumeSpecName: "scripts") pod "3062e734-0f07-4e8f-862e-a2906e7bbbd5" (UID: "3062e734-0f07-4e8f-862e-a2906e7bbbd5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.961879 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3062e734-0f07-4e8f-862e-a2906e7bbbd5" (UID: "3062e734-0f07-4e8f-862e-a2906e7bbbd5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:32 crc kubenswrapper[4740]: I1009 10:45:32.983929 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-config-data" (OuterVolumeSpecName: "config-data") pod "3062e734-0f07-4e8f-862e-a2906e7bbbd5" (UID: "3062e734-0f07-4e8f-862e-a2906e7bbbd5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.031620 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.031671 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.031688 4740 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.031704 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3062e734-0f07-4e8f-862e-a2906e7bbbd5-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.031719 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5mb7\" (UniqueName: \"kubernetes.io/projected/3062e734-0f07-4e8f-862e-a2906e7bbbd5-kube-api-access-h5mb7\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.380975 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.437588 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-sg-core-conf-yaml\") pod \"1f91334a-239f-4459-b885-aa9865bc6a04\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.438542 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f91334a-239f-4459-b885-aa9865bc6a04-log-httpd\") pod \"1f91334a-239f-4459-b885-aa9865bc6a04\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.438646 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-config-data\") pod \"1f91334a-239f-4459-b885-aa9865bc6a04\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.438692 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f91334a-239f-4459-b885-aa9865bc6a04-run-httpd\") pod \"1f91334a-239f-4459-b885-aa9865bc6a04\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.438741 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-scripts\") pod \"1f91334a-239f-4459-b885-aa9865bc6a04\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.438806 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x29x2\" (UniqueName: 
\"kubernetes.io/projected/1f91334a-239f-4459-b885-aa9865bc6a04-kube-api-access-x29x2\") pod \"1f91334a-239f-4459-b885-aa9865bc6a04\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.438834 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-combined-ca-bundle\") pod \"1f91334a-239f-4459-b885-aa9865bc6a04\" (UID: \"1f91334a-239f-4459-b885-aa9865bc6a04\") " Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.439210 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f91334a-239f-4459-b885-aa9865bc6a04-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1f91334a-239f-4459-b885-aa9865bc6a04" (UID: "1f91334a-239f-4459-b885-aa9865bc6a04"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.439486 4740 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f91334a-239f-4459-b885-aa9865bc6a04-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.439716 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f91334a-239f-4459-b885-aa9865bc6a04-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1f91334a-239f-4459-b885-aa9865bc6a04" (UID: "1f91334a-239f-4459-b885-aa9865bc6a04"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.445500 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f91334a-239f-4459-b885-aa9865bc6a04-kube-api-access-x29x2" (OuterVolumeSpecName: "kube-api-access-x29x2") pod "1f91334a-239f-4459-b885-aa9865bc6a04" (UID: "1f91334a-239f-4459-b885-aa9865bc6a04"). InnerVolumeSpecName "kube-api-access-x29x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.452465 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-scripts" (OuterVolumeSpecName: "scripts") pod "1f91334a-239f-4459-b885-aa9865bc6a04" (UID: "1f91334a-239f-4459-b885-aa9865bc6a04"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.479398 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1f91334a-239f-4459-b885-aa9865bc6a04" (UID: "1f91334a-239f-4459-b885-aa9865bc6a04"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.505721 4740 generic.go:334] "Generic (PLEG): container finished" podID="1f91334a-239f-4459-b885-aa9865bc6a04" containerID="d4b67a6cd151baa83233a1bd259e38cc920f1cd7d6b272190590a2e057622e13" exitCode=0 Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.506072 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f91334a-239f-4459-b885-aa9865bc6a04","Type":"ContainerDied","Data":"d4b67a6cd151baa83233a1bd259e38cc920f1cd7d6b272190590a2e057622e13"} Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.506178 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f91334a-239f-4459-b885-aa9865bc6a04","Type":"ContainerDied","Data":"f7181f6679bffc156b68aeaa09bd04438232222c522cd5250bb4a08904c37d01"} Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.506263 4740 scope.go:117] "RemoveContainer" containerID="5e8ab2e51a7f22cab795ad0692ad557452c74d9ab2578ecabd20e6c21b8662d6" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.506479 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.511895 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mw6z4" event={"ID":"3062e734-0f07-4e8f-862e-a2906e7bbbd5","Type":"ContainerDied","Data":"6ad8139eca0f15fd8e30c7156fa7c4a3023f3f5ea90440d6cbdb5588237fada0"} Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.511931 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mw6z4" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.511940 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ad8139eca0f15fd8e30c7156fa7c4a3023f3f5ea90440d6cbdb5588237fada0" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.528696 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" event={"ID":"3e84b517-ac89-462b-baae-559a25766f7e","Type":"ContainerDied","Data":"a0d632418ccc1d6b8dc1715c5034d92fac99aa4534c147c4a226f4938c9b0f58"} Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.528844 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-tr9f9" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.541118 4740 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.541151 4740 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f91334a-239f-4459-b885-aa9865bc6a04-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.541161 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.541169 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x29x2\" (UniqueName: \"kubernetes.io/projected/1f91334a-239f-4459-b885-aa9865bc6a04-kube-api-access-x29x2\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.555859 4740 scope.go:117] "RemoveContainer" 
containerID="4cc409aa83e41e7b27b8e396f286646f4f957ab9d531b4d6914f61e55b438425" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.577933 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f91334a-239f-4459-b885-aa9865bc6a04" (UID: "1f91334a-239f-4459-b885-aa9865bc6a04"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.586190 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-config-data" (OuterVolumeSpecName: "config-data") pod "1f91334a-239f-4459-b885-aa9865bc6a04" (UID: "1f91334a-239f-4459-b885-aa9865bc6a04"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.588568 4740 scope.go:117] "RemoveContainer" containerID="d4b67a6cd151baa83233a1bd259e38cc920f1cd7d6b272190590a2e057622e13" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.617166 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-tr9f9"] Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.625026 4740 scope.go:117] "RemoveContainer" containerID="c26954608c565ffe58564c831888fc6b660bc839108a510935b7f6da3bdfe85d" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.633894 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-tr9f9"] Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.645646 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.645682 4740 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f91334a-239f-4459-b885-aa9865bc6a04-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.651521 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.667016 4740 scope.go:117] "RemoveContainer" containerID="5e8ab2e51a7f22cab795ad0692ad557452c74d9ab2578ecabd20e6c21b8662d6" Oct 09 10:45:33 crc kubenswrapper[4740]: E1009 10:45:33.668287 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e8ab2e51a7f22cab795ad0692ad557452c74d9ab2578ecabd20e6c21b8662d6\": container with ID starting with 5e8ab2e51a7f22cab795ad0692ad557452c74d9ab2578ecabd20e6c21b8662d6 not found: ID does not exist" containerID="5e8ab2e51a7f22cab795ad0692ad557452c74d9ab2578ecabd20e6c21b8662d6" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.668334 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e8ab2e51a7f22cab795ad0692ad557452c74d9ab2578ecabd20e6c21b8662d6"} err="failed to get container status \"5e8ab2e51a7f22cab795ad0692ad557452c74d9ab2578ecabd20e6c21b8662d6\": rpc error: code = NotFound desc = could not find container \"5e8ab2e51a7f22cab795ad0692ad557452c74d9ab2578ecabd20e6c21b8662d6\": container with ID starting with 5e8ab2e51a7f22cab795ad0692ad557452c74d9ab2578ecabd20e6c21b8662d6 not found: ID does not exist" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.668363 4740 scope.go:117] "RemoveContainer" containerID="4cc409aa83e41e7b27b8e396f286646f4f957ab9d531b4d6914f61e55b438425" Oct 09 10:45:33 crc kubenswrapper[4740]: E1009 10:45:33.668796 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4cc409aa83e41e7b27b8e396f286646f4f957ab9d531b4d6914f61e55b438425\": container with ID starting with 4cc409aa83e41e7b27b8e396f286646f4f957ab9d531b4d6914f61e55b438425 not found: ID does not exist" containerID="4cc409aa83e41e7b27b8e396f286646f4f957ab9d531b4d6914f61e55b438425" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.668854 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cc409aa83e41e7b27b8e396f286646f4f957ab9d531b4d6914f61e55b438425"} err="failed to get container status \"4cc409aa83e41e7b27b8e396f286646f4f957ab9d531b4d6914f61e55b438425\": rpc error: code = NotFound desc = could not find container \"4cc409aa83e41e7b27b8e396f286646f4f957ab9d531b4d6914f61e55b438425\": container with ID starting with 4cc409aa83e41e7b27b8e396f286646f4f957ab9d531b4d6914f61e55b438425 not found: ID does not exist" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.668889 4740 scope.go:117] "RemoveContainer" containerID="d4b67a6cd151baa83233a1bd259e38cc920f1cd7d6b272190590a2e057622e13" Oct 09 10:45:33 crc kubenswrapper[4740]: E1009 10:45:33.669309 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4b67a6cd151baa83233a1bd259e38cc920f1cd7d6b272190590a2e057622e13\": container with ID starting with d4b67a6cd151baa83233a1bd259e38cc920f1cd7d6b272190590a2e057622e13 not found: ID does not exist" containerID="d4b67a6cd151baa83233a1bd259e38cc920f1cd7d6b272190590a2e057622e13" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.669362 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b67a6cd151baa83233a1bd259e38cc920f1cd7d6b272190590a2e057622e13"} err="failed to get container status \"d4b67a6cd151baa83233a1bd259e38cc920f1cd7d6b272190590a2e057622e13\": rpc error: code = NotFound desc = could not find container \"d4b67a6cd151baa83233a1bd259e38cc920f1cd7d6b272190590a2e057622e13\": container with ID 
starting with d4b67a6cd151baa83233a1bd259e38cc920f1cd7d6b272190590a2e057622e13 not found: ID does not exist" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.669395 4740 scope.go:117] "RemoveContainer" containerID="c26954608c565ffe58564c831888fc6b660bc839108a510935b7f6da3bdfe85d" Oct 09 10:45:33 crc kubenswrapper[4740]: E1009 10:45:33.669937 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c26954608c565ffe58564c831888fc6b660bc839108a510935b7f6da3bdfe85d\": container with ID starting with c26954608c565ffe58564c831888fc6b660bc839108a510935b7f6da3bdfe85d not found: ID does not exist" containerID="c26954608c565ffe58564c831888fc6b660bc839108a510935b7f6da3bdfe85d" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.669995 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c26954608c565ffe58564c831888fc6b660bc839108a510935b7f6da3bdfe85d"} err="failed to get container status \"c26954608c565ffe58564c831888fc6b660bc839108a510935b7f6da3bdfe85d\": rpc error: code = NotFound desc = could not find container \"c26954608c565ffe58564c831888fc6b660bc839108a510935b7f6da3bdfe85d\": container with ID starting with c26954608c565ffe58564c831888fc6b660bc839108a510935b7f6da3bdfe85d not found: ID does not exist" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.670020 4740 scope.go:117] "RemoveContainer" containerID="6fa0c77057ccff8c499f02dc8114209b7aa046ee19a37dfaf9c5852a6a693f4a" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.693001 4740 scope.go:117] "RemoveContainer" containerID="37751f85cf80f7d38c7d643d59cc8ee715c78a318f2c9b516e7e2c741d6b58b4" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.773973 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e84b517-ac89-462b-baae-559a25766f7e" path="/var/lib/kubelet/pods/3e84b517-ac89-462b-baae-559a25766f7e/volumes" Oct 09 10:45:33 crc kubenswrapper[4740]: 
I1009 10:45:33.820978 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-rh279"] Oct 09 10:45:33 crc kubenswrapper[4740]: E1009 10:45:33.821344 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3062e734-0f07-4e8f-862e-a2906e7bbbd5" containerName="cinder-db-sync" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.821355 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="3062e734-0f07-4e8f-862e-a2906e7bbbd5" containerName="cinder-db-sync" Oct 09 10:45:33 crc kubenswrapper[4740]: E1009 10:45:33.821379 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e84b517-ac89-462b-baae-559a25766f7e" containerName="init" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.821385 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e84b517-ac89-462b-baae-559a25766f7e" containerName="init" Oct 09 10:45:33 crc kubenswrapper[4740]: E1009 10:45:33.821398 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f91334a-239f-4459-b885-aa9865bc6a04" containerName="proxy-httpd" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.821404 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f91334a-239f-4459-b885-aa9865bc6a04" containerName="proxy-httpd" Oct 09 10:45:33 crc kubenswrapper[4740]: E1009 10:45:33.821419 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e84b517-ac89-462b-baae-559a25766f7e" containerName="dnsmasq-dns" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.821424 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e84b517-ac89-462b-baae-559a25766f7e" containerName="dnsmasq-dns" Oct 09 10:45:33 crc kubenswrapper[4740]: E1009 10:45:33.821439 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f91334a-239f-4459-b885-aa9865bc6a04" containerName="ceilometer-notification-agent" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.821445 4740 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="1f91334a-239f-4459-b885-aa9865bc6a04" containerName="ceilometer-notification-agent" Oct 09 10:45:33 crc kubenswrapper[4740]: E1009 10:45:33.821458 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f91334a-239f-4459-b885-aa9865bc6a04" containerName="ceilometer-central-agent" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.821464 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f91334a-239f-4459-b885-aa9865bc6a04" containerName="ceilometer-central-agent" Oct 09 10:45:33 crc kubenswrapper[4740]: E1009 10:45:33.821478 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f91334a-239f-4459-b885-aa9865bc6a04" containerName="sg-core" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.821483 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f91334a-239f-4459-b885-aa9865bc6a04" containerName="sg-core" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.821627 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f91334a-239f-4459-b885-aa9865bc6a04" containerName="proxy-httpd" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.821643 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="3062e734-0f07-4e8f-862e-a2906e7bbbd5" containerName="cinder-db-sync" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.821653 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e84b517-ac89-462b-baae-559a25766f7e" containerName="dnsmasq-dns" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.821662 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f91334a-239f-4459-b885-aa9865bc6a04" containerName="ceilometer-central-agent" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.821673 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f91334a-239f-4459-b885-aa9865bc6a04" containerName="ceilometer-notification-agent" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.821684 4740 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1f91334a-239f-4459-b885-aa9865bc6a04" containerName="sg-core" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.823342 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.839891 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.841696 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.851119 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-dns-svc\") pod \"dnsmasq-dns-69c986f6d7-rh279\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.851253 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-dns-swift-storage-0\") pod \"dnsmasq-dns-69c986f6d7-rh279\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.851307 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-config\") pod \"dnsmasq-dns-69c986f6d7-rh279\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.851331 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-ovsdbserver-nb\") pod \"dnsmasq-dns-69c986f6d7-rh279\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.851379 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-ovsdbserver-sb\") pod \"dnsmasq-dns-69c986f6d7-rh279\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.851411 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whhjf\" (UniqueName: \"kubernetes.io/projected/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-kube-api-access-whhjf\") pod \"dnsmasq-dns-69c986f6d7-rh279\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.854231 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.854511 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.854627 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-89nf8" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.855426 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.861204 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.882188 4740 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-rh279"] Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.890681 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.908617 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.930811 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.933445 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.936661 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.939604 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.951002 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.952814 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-run-httpd\") pod \"ceilometer-0\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " pod="openstack/ceilometer-0" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.952850 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " pod="openstack/ceilometer-0" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.952874 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " pod="openstack/ceilometer-0" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.952920 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7rnn\" (UniqueName: \"kubernetes.io/projected/e83894ef-6c3b-4edd-ad72-e62cce53c34b-kube-api-access-z7rnn\") pod \"cinder-scheduler-0\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.952938 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-scripts\") pod \"cinder-scheduler-0\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.952963 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-dns-swift-storage-0\") pod \"dnsmasq-dns-69c986f6d7-rh279\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.952984 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g22ht\" (UniqueName: \"kubernetes.io/projected/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-kube-api-access-g22ht\") pod \"ceilometer-0\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " pod="openstack/ceilometer-0" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.953219 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-config\") pod \"dnsmasq-dns-69c986f6d7-rh279\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.953268 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.953302 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-config-data\") pod \"ceilometer-0\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " pod="openstack/ceilometer-0" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.953323 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-ovsdbserver-nb\") pod \"dnsmasq-dns-69c986f6d7-rh279\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.953349 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.953398 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-ovsdbserver-sb\") pod \"dnsmasq-dns-69c986f6d7-rh279\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.953437 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e83894ef-6c3b-4edd-ad72-e62cce53c34b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.953452 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-log-httpd\") pod \"ceilometer-0\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " pod="openstack/ceilometer-0" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.953474 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whhjf\" (UniqueName: \"kubernetes.io/projected/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-kube-api-access-whhjf\") pod \"dnsmasq-dns-69c986f6d7-rh279\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.953534 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-scripts\") pod \"ceilometer-0\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " pod="openstack/ceilometer-0" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.953679 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-config-data\") pod 
\"cinder-scheduler-0\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.953709 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-dns-svc\") pod \"dnsmasq-dns-69c986f6d7-rh279\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.954470 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-dns-svc\") pod \"dnsmasq-dns-69c986f6d7-rh279\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.954813 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-dns-swift-storage-0\") pod \"dnsmasq-dns-69c986f6d7-rh279\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.955020 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-config\") pod \"dnsmasq-dns-69c986f6d7-rh279\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.955386 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-ovsdbserver-sb\") pod \"dnsmasq-dns-69c986f6d7-rh279\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 
10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.969963 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-ovsdbserver-nb\") pod \"dnsmasq-dns-69c986f6d7-rh279\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.990510 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whhjf\" (UniqueName: \"kubernetes.io/projected/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-kube-api-access-whhjf\") pod \"dnsmasq-dns-69c986f6d7-rh279\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.995837 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 09 10:45:33 crc kubenswrapper[4740]: I1009 10:45:33.997465 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.000068 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.034839 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.056422 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-run-httpd\") pod \"ceilometer-0\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " pod="openstack/ceilometer-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.056472 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-config-data\") pod \"cinder-api-0\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.056494 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " pod="openstack/ceilometer-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.056511 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " pod="openstack/ceilometer-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.056540 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.056567 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7rnn\" (UniqueName: \"kubernetes.io/projected/e83894ef-6c3b-4edd-ad72-e62cce53c34b-kube-api-access-z7rnn\") pod \"cinder-scheduler-0\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.056585 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-scripts\") pod \"cinder-scheduler-0\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.056610 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g22ht\" (UniqueName: \"kubernetes.io/projected/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-kube-api-access-g22ht\") pod \"ceilometer-0\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " pod="openstack/ceilometer-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.056645 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-config-data-custom\") pod \"cinder-api-0\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.056673 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.056691 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-config-data\") pod \"ceilometer-0\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " pod="openstack/ceilometer-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.056713 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.056729 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-scripts\") pod \"cinder-api-0\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.056776 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e83894ef-6c3b-4edd-ad72-e62cce53c34b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.056795 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-log-httpd\") pod \"ceilometer-0\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " pod="openstack/ceilometer-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.056826 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-scripts\") pod \"ceilometer-0\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " pod="openstack/ceilometer-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.056846 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-config-data\") pod \"cinder-scheduler-0\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.056861 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ae92657-8493-41a1-8981-416ed42e203d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.056881 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rf9w\" (UniqueName: \"kubernetes.io/projected/0ae92657-8493-41a1-8981-416ed42e203d-kube-api-access-9rf9w\") pod \"cinder-api-0\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.056896 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ae92657-8493-41a1-8981-416ed42e203d-logs\") pod \"cinder-api-0\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.056973 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-run-httpd\") pod \"ceilometer-0\" (UID: 
\"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " pod="openstack/ceilometer-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.057733 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-log-httpd\") pod \"ceilometer-0\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " pod="openstack/ceilometer-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.058354 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e83894ef-6c3b-4edd-ad72-e62cce53c34b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.063293 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " pod="openstack/ceilometer-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.064645 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-scripts\") pod \"ceilometer-0\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " pod="openstack/ceilometer-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.065325 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.065549 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.065734 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-scripts\") pod \"cinder-scheduler-0\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.065968 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-config-data\") pod \"ceilometer-0\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " pod="openstack/ceilometer-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.069339 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " pod="openstack/ceilometer-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.071157 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-config-data\") pod \"cinder-scheduler-0\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.075928 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7rnn\" (UniqueName: \"kubernetes.io/projected/e83894ef-6c3b-4edd-ad72-e62cce53c34b-kube-api-access-z7rnn\") pod \"cinder-scheduler-0\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:34 crc kubenswrapper[4740]: 
I1009 10:45:34.078067 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g22ht\" (UniqueName: \"kubernetes.io/projected/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-kube-api-access-g22ht\") pod \"ceilometer-0\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " pod="openstack/ceilometer-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.144339 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.156869 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.157981 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-scripts\") pod \"cinder-api-0\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.158135 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ae92657-8493-41a1-8981-416ed42e203d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.158162 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ae92657-8493-41a1-8981-416ed42e203d-logs\") pod \"cinder-api-0\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.158184 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rf9w\" (UniqueName: \"kubernetes.io/projected/0ae92657-8493-41a1-8981-416ed42e203d-kube-api-access-9rf9w\") pod 
\"cinder-api-0\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.158222 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-config-data\") pod \"cinder-api-0\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.158252 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.158262 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ae92657-8493-41a1-8981-416ed42e203d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.158295 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-config-data-custom\") pod \"cinder-api-0\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.158703 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ae92657-8493-41a1-8981-416ed42e203d-logs\") pod \"cinder-api-0\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.161562 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-config-data-custom\") pod \"cinder-api-0\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.163280 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.172275 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-config-data\") pod \"cinder-api-0\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.176477 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rf9w\" (UniqueName: \"kubernetes.io/projected/0ae92657-8493-41a1-8981-416ed42e203d-kube-api-access-9rf9w\") pod \"cinder-api-0\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.179173 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.179605 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-scripts\") pod \"cinder-api-0\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.271965 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.323572 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.902473 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 10:45:34 crc kubenswrapper[4740]: I1009 10:45:34.917321 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-rh279"] Oct 09 10:45:35 crc kubenswrapper[4740]: I1009 10:45:35.037709 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:45:35 crc kubenswrapper[4740]: W1009 10:45:35.043623 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeeb3ea7a_c4b5_4f0d_b4e6_31c0699bd1b3.slice/crio-173d24bdf4169915afd0d52529d7e0f65d43a6610fc8d03619444cafa3ee7388 WatchSource:0}: Error finding container 173d24bdf4169915afd0d52529d7e0f65d43a6610fc8d03619444cafa3ee7388: Status 404 returned error can't find the container with id 173d24bdf4169915afd0d52529d7e0f65d43a6610fc8d03619444cafa3ee7388 Oct 09 10:45:35 crc kubenswrapper[4740]: I1009 10:45:35.122257 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 09 10:45:35 crc kubenswrapper[4740]: I1009 10:45:35.589546 4740 generic.go:334] "Generic (PLEG): container finished" podID="ee8ab1e0-ece4-4528-8609-7f56fb884ae8" containerID="41c65550daf286b87ae96c7fdcbb6813950a0359ac01d7552a39d0cefb7ab0a4" exitCode=0 Oct 09 10:45:35 crc kubenswrapper[4740]: I1009 10:45:35.589775 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-rh279" event={"ID":"ee8ab1e0-ece4-4528-8609-7f56fb884ae8","Type":"ContainerDied","Data":"41c65550daf286b87ae96c7fdcbb6813950a0359ac01d7552a39d0cefb7ab0a4"} Oct 09 10:45:35 crc kubenswrapper[4740]: I1009 
10:45:35.590013 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-rh279" event={"ID":"ee8ab1e0-ece4-4528-8609-7f56fb884ae8","Type":"ContainerStarted","Data":"5d8ac3b1fc2ae5fed0cfa69934f96e42324b436669e5c53b52e976d9ee6006c4"} Oct 09 10:45:35 crc kubenswrapper[4740]: I1009 10:45:35.596904 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0ae92657-8493-41a1-8981-416ed42e203d","Type":"ContainerStarted","Data":"823299080ae702a2b3a225d7b33e85d49556293fe4e0ff0c192a078670a0b0ed"} Oct 09 10:45:35 crc kubenswrapper[4740]: I1009 10:45:35.606070 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3","Type":"ContainerStarted","Data":"173d24bdf4169915afd0d52529d7e0f65d43a6610fc8d03619444cafa3ee7388"} Oct 09 10:45:35 crc kubenswrapper[4740]: I1009 10:45:35.622229 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e83894ef-6c3b-4edd-ad72-e62cce53c34b","Type":"ContainerStarted","Data":"d619fddd70cb763c926823faa5273b3a84ef41753450a4c67204cd7cd492541e"} Oct 09 10:45:35 crc kubenswrapper[4740]: I1009 10:45:35.778158 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f91334a-239f-4459-b885-aa9865bc6a04" path="/var/lib/kubelet/pods/1f91334a-239f-4459-b885-aa9865bc6a04/volumes" Oct 09 10:45:35 crc kubenswrapper[4740]: I1009 10:45:35.778927 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 09 10:45:36 crc kubenswrapper[4740]: I1009 10:45:36.080718 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:45:36 crc kubenswrapper[4740]: I1009 10:45:36.250086 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:45:36 crc kubenswrapper[4740]: I1009 10:45:36.654063 4740 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0ae92657-8493-41a1-8981-416ed42e203d","Type":"ContainerStarted","Data":"b6d64076869c28aa2e72a491f0a1fb54fef94402fe8c4ccf51fe58395772ff64"} Oct 09 10:45:36 crc kubenswrapper[4740]: I1009 10:45:36.657325 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3","Type":"ContainerStarted","Data":"8fad223b9b5380edb8648535ccd0110e972a07409b1f00e7005dcda2620179b8"} Oct 09 10:45:36 crc kubenswrapper[4740]: I1009 10:45:36.659885 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-rh279" event={"ID":"ee8ab1e0-ece4-4528-8609-7f56fb884ae8","Type":"ContainerStarted","Data":"3add4974bea86670e4d0ea33aea09eb81e0ca9356bb1eefb14f9dfff8808001e"} Oct 09 10:45:36 crc kubenswrapper[4740]: I1009 10:45:36.660083 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:36 crc kubenswrapper[4740]: I1009 10:45:36.731988 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69c986f6d7-rh279" podStartSLOduration=3.731970589 podStartE2EDuration="3.731970589s" podCreationTimestamp="2025-10-09 10:45:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:45:36.724635281 +0000 UTC m=+1075.686835672" watchObservedRunningTime="2025-10-09 10:45:36.731970589 +0000 UTC m=+1075.694170970" Oct 09 10:45:37 crc kubenswrapper[4740]: I1009 10:45:37.095340 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:37 crc kubenswrapper[4740]: I1009 10:45:37.681306 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3","Type":"ContainerStarted","Data":"1234ce3da3be36738b380320cb596846d7cde22c4335b42f547bae728cd1b611"} Oct 09 10:45:37 crc kubenswrapper[4740]: I1009 10:45:37.695220 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e83894ef-6c3b-4edd-ad72-e62cce53c34b","Type":"ContainerStarted","Data":"4faf68671e0fb6b84725f8fdfbfc1be1b2bcaa673bfdad351bc204dac076c3f7"} Oct 09 10:45:37 crc kubenswrapper[4740]: I1009 10:45:37.706212 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0ae92657-8493-41a1-8981-416ed42e203d","Type":"ContainerStarted","Data":"681d5cc39ea18db33fc0725a815242933a8209e01e0539fb8f08e55e411cdc63"} Oct 09 10:45:37 crc kubenswrapper[4740]: I1009 10:45:37.706243 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="0ae92657-8493-41a1-8981-416ed42e203d" containerName="cinder-api-log" containerID="cri-o://b6d64076869c28aa2e72a491f0a1fb54fef94402fe8c4ccf51fe58395772ff64" gracePeriod=30 Oct 09 10:45:37 crc kubenswrapper[4740]: I1009 10:45:37.706346 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="0ae92657-8493-41a1-8981-416ed42e203d" containerName="cinder-api" containerID="cri-o://681d5cc39ea18db33fc0725a815242933a8209e01e0539fb8f08e55e411cdc63" gracePeriod=30 Oct 09 10:45:37 crc kubenswrapper[4740]: I1009 10:45:37.706772 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 09 10:45:37 crc kubenswrapper[4740]: I1009 10:45:37.739042 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.739027068 podStartE2EDuration="4.739027068s" podCreationTimestamp="2025-10-09 10:45:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-09 10:45:37.734566747 +0000 UTC m=+1076.696767128" watchObservedRunningTime="2025-10-09 10:45:37.739027068 +0000 UTC m=+1076.701227449" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.542294 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.582848 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-combined-ca-bundle\") pod \"0ae92657-8493-41a1-8981-416ed42e203d\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.582947 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-scripts\") pod \"0ae92657-8493-41a1-8981-416ed42e203d\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.583004 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ae92657-8493-41a1-8981-416ed42e203d-logs\") pod \"0ae92657-8493-41a1-8981-416ed42e203d\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.583033 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rf9w\" (UniqueName: \"kubernetes.io/projected/0ae92657-8493-41a1-8981-416ed42e203d-kube-api-access-9rf9w\") pod \"0ae92657-8493-41a1-8981-416ed42e203d\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.583152 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ae92657-8493-41a1-8981-416ed42e203d-etc-machine-id\") pod 
\"0ae92657-8493-41a1-8981-416ed42e203d\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.583212 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-config-data-custom\") pod \"0ae92657-8493-41a1-8981-416ed42e203d\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.583266 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-config-data\") pod \"0ae92657-8493-41a1-8981-416ed42e203d\" (UID: \"0ae92657-8493-41a1-8981-416ed42e203d\") " Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.586832 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ae92657-8493-41a1-8981-416ed42e203d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0ae92657-8493-41a1-8981-416ed42e203d" (UID: "0ae92657-8493-41a1-8981-416ed42e203d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.592384 4740 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ae92657-8493-41a1-8981-416ed42e203d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.592905 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0ae92657-8493-41a1-8981-416ed42e203d" (UID: "0ae92657-8493-41a1-8981-416ed42e203d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.597970 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ae92657-8493-41a1-8981-416ed42e203d-kube-api-access-9rf9w" (OuterVolumeSpecName: "kube-api-access-9rf9w") pod "0ae92657-8493-41a1-8981-416ed42e203d" (UID: "0ae92657-8493-41a1-8981-416ed42e203d"). InnerVolumeSpecName "kube-api-access-9rf9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.598088 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-scripts" (OuterVolumeSpecName: "scripts") pod "0ae92657-8493-41a1-8981-416ed42e203d" (UID: "0ae92657-8493-41a1-8981-416ed42e203d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.601559 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ae92657-8493-41a1-8981-416ed42e203d-logs" (OuterVolumeSpecName: "logs") pod "0ae92657-8493-41a1-8981-416ed42e203d" (UID: "0ae92657-8493-41a1-8981-416ed42e203d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.628333 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ae92657-8493-41a1-8981-416ed42e203d" (UID: "0ae92657-8493-41a1-8981-416ed42e203d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.662385 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-config-data" (OuterVolumeSpecName: "config-data") pod "0ae92657-8493-41a1-8981-416ed42e203d" (UID: "0ae92657-8493-41a1-8981-416ed42e203d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.694063 4740 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.694112 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.694124 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.694136 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ae92657-8493-41a1-8981-416ed42e203d-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.694174 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ae92657-8493-41a1-8981-416ed42e203d-logs\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.694185 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rf9w\" (UniqueName: 
\"kubernetes.io/projected/0ae92657-8493-41a1-8981-416ed42e203d-kube-api-access-9rf9w\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.720542 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3","Type":"ContainerStarted","Data":"02d2855788adc8a6b430634bc0dc2ad436c644505ffb8ad4dbfc22e0c9ba8b19"} Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.722276 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e83894ef-6c3b-4edd-ad72-e62cce53c34b","Type":"ContainerStarted","Data":"3a83d07e53ba32ae07fd483ed99f1a3e55d0fa350dbdefe2c8dd54b8e8aa42e2"} Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.725957 4740 generic.go:334] "Generic (PLEG): container finished" podID="0ae92657-8493-41a1-8981-416ed42e203d" containerID="681d5cc39ea18db33fc0725a815242933a8209e01e0539fb8f08e55e411cdc63" exitCode=0 Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.725986 4740 generic.go:334] "Generic (PLEG): container finished" podID="0ae92657-8493-41a1-8981-416ed42e203d" containerID="b6d64076869c28aa2e72a491f0a1fb54fef94402fe8c4ccf51fe58395772ff64" exitCode=143 Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.726014 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0ae92657-8493-41a1-8981-416ed42e203d","Type":"ContainerDied","Data":"681d5cc39ea18db33fc0725a815242933a8209e01e0539fb8f08e55e411cdc63"} Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.726043 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0ae92657-8493-41a1-8981-416ed42e203d","Type":"ContainerDied","Data":"b6d64076869c28aa2e72a491f0a1fb54fef94402fe8c4ccf51fe58395772ff64"} Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.726057 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"0ae92657-8493-41a1-8981-416ed42e203d","Type":"ContainerDied","Data":"823299080ae702a2b3a225d7b33e85d49556293fe4e0ff0c192a078670a0b0ed"} Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.726065 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.726084 4740 scope.go:117] "RemoveContainer" containerID="681d5cc39ea18db33fc0725a815242933a8209e01e0539fb8f08e55e411cdc63" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.740886 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.763478489 podStartE2EDuration="5.740870424s" podCreationTimestamp="2025-10-09 10:45:33 +0000 UTC" firstStartedPulling="2025-10-09 10:45:34.89425477 +0000 UTC m=+1073.856455151" lastFinishedPulling="2025-10-09 10:45:35.871646705 +0000 UTC m=+1074.833847086" observedRunningTime="2025-10-09 10:45:38.740087953 +0000 UTC m=+1077.702288334" watchObservedRunningTime="2025-10-09 10:45:38.740870424 +0000 UTC m=+1077.703070795" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.764931 4740 scope.go:117] "RemoveContainer" containerID="b6d64076869c28aa2e72a491f0a1fb54fef94402fe8c4ccf51fe58395772ff64" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.776734 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.785947 4740 scope.go:117] "RemoveContainer" containerID="681d5cc39ea18db33fc0725a815242933a8209e01e0539fb8f08e55e411cdc63" Oct 09 10:45:38 crc kubenswrapper[4740]: E1009 10:45:38.786494 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"681d5cc39ea18db33fc0725a815242933a8209e01e0539fb8f08e55e411cdc63\": container with ID starting with 681d5cc39ea18db33fc0725a815242933a8209e01e0539fb8f08e55e411cdc63 not found: ID does 
not exist" containerID="681d5cc39ea18db33fc0725a815242933a8209e01e0539fb8f08e55e411cdc63" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.786531 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"681d5cc39ea18db33fc0725a815242933a8209e01e0539fb8f08e55e411cdc63"} err="failed to get container status \"681d5cc39ea18db33fc0725a815242933a8209e01e0539fb8f08e55e411cdc63\": rpc error: code = NotFound desc = could not find container \"681d5cc39ea18db33fc0725a815242933a8209e01e0539fb8f08e55e411cdc63\": container with ID starting with 681d5cc39ea18db33fc0725a815242933a8209e01e0539fb8f08e55e411cdc63 not found: ID does not exist" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.786550 4740 scope.go:117] "RemoveContainer" containerID="b6d64076869c28aa2e72a491f0a1fb54fef94402fe8c4ccf51fe58395772ff64" Oct 09 10:45:38 crc kubenswrapper[4740]: E1009 10:45:38.786720 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6d64076869c28aa2e72a491f0a1fb54fef94402fe8c4ccf51fe58395772ff64\": container with ID starting with b6d64076869c28aa2e72a491f0a1fb54fef94402fe8c4ccf51fe58395772ff64 not found: ID does not exist" containerID="b6d64076869c28aa2e72a491f0a1fb54fef94402fe8c4ccf51fe58395772ff64" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.786741 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6d64076869c28aa2e72a491f0a1fb54fef94402fe8c4ccf51fe58395772ff64"} err="failed to get container status \"b6d64076869c28aa2e72a491f0a1fb54fef94402fe8c4ccf51fe58395772ff64\": rpc error: code = NotFound desc = could not find container \"b6d64076869c28aa2e72a491f0a1fb54fef94402fe8c4ccf51fe58395772ff64\": container with ID starting with b6d64076869c28aa2e72a491f0a1fb54fef94402fe8c4ccf51fe58395772ff64 not found: ID does not exist" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.786756 4740 
scope.go:117] "RemoveContainer" containerID="681d5cc39ea18db33fc0725a815242933a8209e01e0539fb8f08e55e411cdc63" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.788109 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"681d5cc39ea18db33fc0725a815242933a8209e01e0539fb8f08e55e411cdc63"} err="failed to get container status \"681d5cc39ea18db33fc0725a815242933a8209e01e0539fb8f08e55e411cdc63\": rpc error: code = NotFound desc = could not find container \"681d5cc39ea18db33fc0725a815242933a8209e01e0539fb8f08e55e411cdc63\": container with ID starting with 681d5cc39ea18db33fc0725a815242933a8209e01e0539fb8f08e55e411cdc63 not found: ID does not exist" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.788129 4740 scope.go:117] "RemoveContainer" containerID="b6d64076869c28aa2e72a491f0a1fb54fef94402fe8c4ccf51fe58395772ff64" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.788280 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6d64076869c28aa2e72a491f0a1fb54fef94402fe8c4ccf51fe58395772ff64"} err="failed to get container status \"b6d64076869c28aa2e72a491f0a1fb54fef94402fe8c4ccf51fe58395772ff64\": rpc error: code = NotFound desc = could not find container \"b6d64076869c28aa2e72a491f0a1fb54fef94402fe8c4ccf51fe58395772ff64\": container with ID starting with b6d64076869c28aa2e72a491f0a1fb54fef94402fe8c4ccf51fe58395772ff64 not found: ID does not exist" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.803161 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.809388 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 09 10:45:38 crc kubenswrapper[4740]: E1009 10:45:38.809808 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae92657-8493-41a1-8981-416ed42e203d" containerName="cinder-api-log" Oct 09 10:45:38 crc 
kubenswrapper[4740]: I1009 10:45:38.809823 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae92657-8493-41a1-8981-416ed42e203d" containerName="cinder-api-log" Oct 09 10:45:38 crc kubenswrapper[4740]: E1009 10:45:38.809845 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae92657-8493-41a1-8981-416ed42e203d" containerName="cinder-api" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.809851 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae92657-8493-41a1-8981-416ed42e203d" containerName="cinder-api" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.810015 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ae92657-8493-41a1-8981-416ed42e203d" containerName="cinder-api" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.810044 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ae92657-8493-41a1-8981-416ed42e203d" containerName="cinder-api-log" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.810954 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.814181 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.814235 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.814194 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.815999 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.864038 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.879665 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5dd4b95776-lcxbt" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.898037 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56b8e346-13ed-4f64-88af-13be77ceddfa-logs\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.898108 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b8e346-13ed-4f64-88af-13be77ceddfa-public-tls-certs\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.898134 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/56b8e346-13ed-4f64-88af-13be77ceddfa-config-data-custom\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.898154 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b8e346-13ed-4f64-88af-13be77ceddfa-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.898221 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56b8e346-13ed-4f64-88af-13be77ceddfa-config-data\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.898238 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b8e346-13ed-4f64-88af-13be77ceddfa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.898327 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq5km\" (UniqueName: \"kubernetes.io/projected/56b8e346-13ed-4f64-88af-13be77ceddfa-kube-api-access-sq5km\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.898392 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56b8e346-13ed-4f64-88af-13be77ceddfa-scripts\") pod \"cinder-api-0\" (UID: 
\"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.898413 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56b8e346-13ed-4f64-88af-13be77ceddfa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.944333 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f67cbf644-2n99k"] Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.999896 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56b8e346-13ed-4f64-88af-13be77ceddfa-scripts\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:38 crc kubenswrapper[4740]: I1009 10:45:38.999945 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56b8e346-13ed-4f64-88af-13be77ceddfa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:38.999966 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56b8e346-13ed-4f64-88af-13be77ceddfa-logs\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.000002 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b8e346-13ed-4f64-88af-13be77ceddfa-public-tls-certs\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " 
pod="openstack/cinder-api-0" Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.000027 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56b8e346-13ed-4f64-88af-13be77ceddfa-config-data-custom\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.000044 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b8e346-13ed-4f64-88af-13be77ceddfa-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.000087 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56b8e346-13ed-4f64-88af-13be77ceddfa-config-data\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.000109 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b8e346-13ed-4f64-88af-13be77ceddfa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.000178 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq5km\" (UniqueName: \"kubernetes.io/projected/56b8e346-13ed-4f64-88af-13be77ceddfa-kube-api-access-sq5km\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.000587 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/56b8e346-13ed-4f64-88af-13be77ceddfa-logs\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.005851 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56b8e346-13ed-4f64-88af-13be77ceddfa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.009865 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b8e346-13ed-4f64-88af-13be77ceddfa-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.011707 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b8e346-13ed-4f64-88af-13be77ceddfa-public-tls-certs\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.013241 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56b8e346-13ed-4f64-88af-13be77ceddfa-scripts\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.026099 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq5km\" (UniqueName: \"kubernetes.io/projected/56b8e346-13ed-4f64-88af-13be77ceddfa-kube-api-access-sq5km\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.027328 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56b8e346-13ed-4f64-88af-13be77ceddfa-config-data-custom\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.027393 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56b8e346-13ed-4f64-88af-13be77ceddfa-config-data\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.027939 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b8e346-13ed-4f64-88af-13be77ceddfa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"56b8e346-13ed-4f64-88af-13be77ceddfa\") " pod="openstack/cinder-api-0" Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.141802 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.182977 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.673181 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 09 10:45:39 crc kubenswrapper[4740]: W1009 10:45:39.673891 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56b8e346_13ed_4f64_88af_13be77ceddfa.slice/crio-72bff7b1985eb03b77f38f9566d5b4de003b9dcb0926f2dd4ed4083548ccf492 WatchSource:0}: Error finding container 72bff7b1985eb03b77f38f9566d5b4de003b9dcb0926f2dd4ed4083548ccf492: Status 404 returned error can't find the container with id 72bff7b1985eb03b77f38f9566d5b4de003b9dcb0926f2dd4ed4083548ccf492 Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.752081 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"56b8e346-13ed-4f64-88af-13be77ceddfa","Type":"ContainerStarted","Data":"72bff7b1985eb03b77f38f9566d5b4de003b9dcb0926f2dd4ed4083548ccf492"} Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.752380 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6f67cbf644-2n99k" podUID="5d46647a-6230-4561-bd21-a433ed55dad2" containerName="horizon-log" containerID="cri-o://f484446da81e0b28190c45af24087e227074250e4f8f8dace13601acb254a05a" gracePeriod=30 Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.752818 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6f67cbf644-2n99k" podUID="5d46647a-6230-4561-bd21-a433ed55dad2" containerName="horizon" containerID="cri-o://bf82d8ce076fba29ba8e3099eed54d6fb6ae0c40849c42e47a11986d72082416" gracePeriod=30 Oct 09 10:45:39 crc kubenswrapper[4740]: I1009 10:45:39.787412 4740 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ae92657-8493-41a1-8981-416ed42e203d" path="/var/lib/kubelet/pods/0ae92657-8493-41a1-8981-416ed42e203d/volumes" Oct 09 10:45:40 crc kubenswrapper[4740]: I1009 10:45:40.131339 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-544757df48-b9dz7" Oct 09 10:45:40 crc kubenswrapper[4740]: I1009 10:45:40.186841 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7d7f9c54-kwrl7"] Oct 09 10:45:40 crc kubenswrapper[4740]: I1009 10:45:40.187047 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7d7f9c54-kwrl7" podUID="28a64be3-4f9f-4ef1-830d-fb0d8919e1bc" containerName="barbican-api-log" containerID="cri-o://a98c21eed4d0e1bdbab9c4f4a2d8a3a69aaf2528ff6c4a613504a7a0684f43f2" gracePeriod=30 Oct 09 10:45:40 crc kubenswrapper[4740]: I1009 10:45:40.187406 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7d7f9c54-kwrl7" podUID="28a64be3-4f9f-4ef1-830d-fb0d8919e1bc" containerName="barbican-api" containerID="cri-o://69c62d42a35a352f169fbf0778bd61ee3bbb732992914b48d25dd7792b562edd" gracePeriod=30 Oct 09 10:45:40 crc kubenswrapper[4740]: I1009 10:45:40.767211 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3","Type":"ContainerStarted","Data":"dc54c526fa37681bf9b2356a7bd9ecff678f102e8f84d10425927807b3ae3379"} Oct 09 10:45:40 crc kubenswrapper[4740]: I1009 10:45:40.767592 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 10:45:40 crc kubenswrapper[4740]: I1009 10:45:40.780775 4740 generic.go:334] "Generic (PLEG): container finished" podID="28a64be3-4f9f-4ef1-830d-fb0d8919e1bc" containerID="a98c21eed4d0e1bdbab9c4f4a2d8a3a69aaf2528ff6c4a613504a7a0684f43f2" exitCode=143 Oct 09 10:45:40 crc 
kubenswrapper[4740]: I1009 10:45:40.780805 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d7f9c54-kwrl7" event={"ID":"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc","Type":"ContainerDied","Data":"a98c21eed4d0e1bdbab9c4f4a2d8a3a69aaf2528ff6c4a613504a7a0684f43f2"} Oct 09 10:45:40 crc kubenswrapper[4740]: I1009 10:45:40.790465 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"56b8e346-13ed-4f64-88af-13be77ceddfa","Type":"ContainerStarted","Data":"d277b3039966166a911d7336f9e0f8ea4d2c60de102c42eae4287ad7477c4866"} Oct 09 10:45:41 crc kubenswrapper[4740]: I1009 10:45:41.795627 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.469463855 podStartE2EDuration="8.795605258s" podCreationTimestamp="2025-10-09 10:45:33 +0000 UTC" firstStartedPulling="2025-10-09 10:45:35.047604724 +0000 UTC m=+1074.009805105" lastFinishedPulling="2025-10-09 10:45:39.373746137 +0000 UTC m=+1078.335946508" observedRunningTime="2025-10-09 10:45:40.798950672 +0000 UTC m=+1079.761151043" watchObservedRunningTime="2025-10-09 10:45:41.795605258 +0000 UTC m=+1080.757805649" Oct 09 10:45:41 crc kubenswrapper[4740]: I1009 10:45:41.821242 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"56b8e346-13ed-4f64-88af-13be77ceddfa","Type":"ContainerStarted","Data":"f0a45566f234753660ca472940f897ce98ff6e33c8c2872146869be5b44ee84a"} Oct 09 10:45:41 crc kubenswrapper[4740]: I1009 10:45:41.822937 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 09 10:45:41 crc kubenswrapper[4740]: I1009 10:45:41.841378 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.841359938 podStartE2EDuration="3.841359938s" podCreationTimestamp="2025-10-09 10:45:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:45:41.839724803 +0000 UTC m=+1080.801925184" watchObservedRunningTime="2025-10-09 10:45:41.841359938 +0000 UTC m=+1080.803560319" Oct 09 10:45:42 crc kubenswrapper[4740]: I1009 10:45:42.834177 4740 generic.go:334] "Generic (PLEG): container finished" podID="fb96e05e-80fb-4eec-b609-123ed43152ae" containerID="08a803f91f066820fd59c7e3e405a1c4934c77c751a2c0dc019586ab31cbf341" exitCode=0 Oct 09 10:45:42 crc kubenswrapper[4740]: I1009 10:45:42.834256 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-f8w6m" event={"ID":"fb96e05e-80fb-4eec-b609-123ed43152ae","Type":"ContainerDied","Data":"08a803f91f066820fd59c7e3e405a1c4934c77c751a2c0dc019586ab31cbf341"} Oct 09 10:45:43 crc kubenswrapper[4740]: I1009 10:45:43.475837 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7d7f9c54-kwrl7" podUID="28a64be3-4f9f-4ef1-830d-fb0d8919e1bc" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:42314->10.217.0.160:9311: read: connection reset by peer" Oct 09 10:45:43 crc kubenswrapper[4740]: I1009 10:45:43.475950 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7d7f9c54-kwrl7" podUID="28a64be3-4f9f-4ef1-830d-fb0d8919e1bc" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:42312->10.217.0.160:9311: read: connection reset by peer" Oct 09 10:45:43 crc kubenswrapper[4740]: I1009 10:45:43.516823 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6f67cbf644-2n99k" podUID="5d46647a-6230-4561-bd21-a433ed55dad2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Oct 09 10:45:43 crc 
kubenswrapper[4740]: I1009 10:45:43.845350 4740 generic.go:334] "Generic (PLEG): container finished" podID="5d46647a-6230-4561-bd21-a433ed55dad2" containerID="bf82d8ce076fba29ba8e3099eed54d6fb6ae0c40849c42e47a11986d72082416" exitCode=0 Oct 09 10:45:43 crc kubenswrapper[4740]: I1009 10:45:43.845420 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f67cbf644-2n99k" event={"ID":"5d46647a-6230-4561-bd21-a433ed55dad2","Type":"ContainerDied","Data":"bf82d8ce076fba29ba8e3099eed54d6fb6ae0c40849c42e47a11986d72082416"} Oct 09 10:45:43 crc kubenswrapper[4740]: I1009 10:45:43.848343 4740 generic.go:334] "Generic (PLEG): container finished" podID="28a64be3-4f9f-4ef1-830d-fb0d8919e1bc" containerID="69c62d42a35a352f169fbf0778bd61ee3bbb732992914b48d25dd7792b562edd" exitCode=0 Oct 09 10:45:43 crc kubenswrapper[4740]: I1009 10:45:43.848423 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d7f9c54-kwrl7" event={"ID":"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc","Type":"ContainerDied","Data":"69c62d42a35a352f169fbf0778bd61ee3bbb732992914b48d25dd7792b562edd"} Oct 09 10:45:43 crc kubenswrapper[4740]: I1009 10:45:43.848456 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d7f9c54-kwrl7" event={"ID":"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc","Type":"ContainerDied","Data":"e02cfe8410a4c220df7037af22d112534209ac23e44c34d130ee98f453660ada"} Oct 09 10:45:43 crc kubenswrapper[4740]: I1009 10:45:43.848467 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e02cfe8410a4c220df7037af22d112534209ac23e44c34d130ee98f453660ada" Oct 09 10:45:43 crc kubenswrapper[4740]: I1009 10:45:43.887012 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:43 crc kubenswrapper[4740]: I1009 10:45:43.990216 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-logs\") pod \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\" (UID: \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\") " Oct 09 10:45:43 crc kubenswrapper[4740]: I1009 10:45:43.990433 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-config-data-custom\") pod \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\" (UID: \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\") " Oct 09 10:45:43 crc kubenswrapper[4740]: I1009 10:45:43.990463 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-combined-ca-bundle\") pod \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\" (UID: \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\") " Oct 09 10:45:43 crc kubenswrapper[4740]: I1009 10:45:43.990515 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgvn6\" (UniqueName: \"kubernetes.io/projected/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-kube-api-access-jgvn6\") pod \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\" (UID: \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\") " Oct 09 10:45:43 crc kubenswrapper[4740]: I1009 10:45:43.990580 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-config-data\") pod \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\" (UID: \"28a64be3-4f9f-4ef1-830d-fb0d8919e1bc\") " Oct 09 10:45:43 crc kubenswrapper[4740]: I1009 10:45:43.993275 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-logs" (OuterVolumeSpecName: "logs") pod "28a64be3-4f9f-4ef1-830d-fb0d8919e1bc" (UID: "28a64be3-4f9f-4ef1-830d-fb0d8919e1bc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:45:43 crc kubenswrapper[4740]: I1009 10:45:43.996954 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "28a64be3-4f9f-4ef1-830d-fb0d8919e1bc" (UID: "28a64be3-4f9f-4ef1-830d-fb0d8919e1bc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.003984 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-kube-api-access-jgvn6" (OuterVolumeSpecName: "kube-api-access-jgvn6") pod "28a64be3-4f9f-4ef1-830d-fb0d8919e1bc" (UID: "28a64be3-4f9f-4ef1-830d-fb0d8919e1bc"). InnerVolumeSpecName "kube-api-access-jgvn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.042958 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28a64be3-4f9f-4ef1-830d-fb0d8919e1bc" (UID: "28a64be3-4f9f-4ef1-830d-fb0d8919e1bc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.092973 4740 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.093010 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.093023 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgvn6\" (UniqueName: \"kubernetes.io/projected/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-kube-api-access-jgvn6\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.093037 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-logs\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.111026 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-config-data" (OuterVolumeSpecName: "config-data") pod "28a64be3-4f9f-4ef1-830d-fb0d8919e1bc" (UID: "28a64be3-4f9f-4ef1-830d-fb0d8919e1bc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.159003 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.203107 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.233290 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-f8w6m" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.254643 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-cxphd"] Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.254915 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" podUID="ddf10f98-4990-4b57-b586-8c47fcf5993e" containerName="dnsmasq-dns" containerID="cri-o://94da4d88ef1c2e3c92a065fcc00c74247fe46254025bff6b7ba065c480db2b77" gracePeriod=10 Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.306097 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb96e05e-80fb-4eec-b609-123ed43152ae-combined-ca-bundle\") pod \"fb96e05e-80fb-4eec-b609-123ed43152ae\" (UID: \"fb96e05e-80fb-4eec-b609-123ed43152ae\") " Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.306218 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb96e05e-80fb-4eec-b609-123ed43152ae-config\") pod \"fb96e05e-80fb-4eec-b609-123ed43152ae\" (UID: \"fb96e05e-80fb-4eec-b609-123ed43152ae\") " Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.306247 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jxkjc\" (UniqueName: \"kubernetes.io/projected/fb96e05e-80fb-4eec-b609-123ed43152ae-kube-api-access-jxkjc\") pod \"fb96e05e-80fb-4eec-b609-123ed43152ae\" (UID: \"fb96e05e-80fb-4eec-b609-123ed43152ae\") " Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.314371 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb96e05e-80fb-4eec-b609-123ed43152ae-kube-api-access-jxkjc" (OuterVolumeSpecName: "kube-api-access-jxkjc") pod "fb96e05e-80fb-4eec-b609-123ed43152ae" (UID: "fb96e05e-80fb-4eec-b609-123ed43152ae"). InnerVolumeSpecName "kube-api-access-jxkjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.339207 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb96e05e-80fb-4eec-b609-123ed43152ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb96e05e-80fb-4eec-b609-123ed43152ae" (UID: "fb96e05e-80fb-4eec-b609-123ed43152ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.358368 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb96e05e-80fb-4eec-b609-123ed43152ae-config" (OuterVolumeSpecName: "config") pod "fb96e05e-80fb-4eec-b609-123ed43152ae" (UID: "fb96e05e-80fb-4eec-b609-123ed43152ae"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.408416 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb96e05e-80fb-4eec-b609-123ed43152ae-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.408463 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxkjc\" (UniqueName: \"kubernetes.io/projected/fb96e05e-80fb-4eec-b609-123ed43152ae-kube-api-access-jxkjc\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.408480 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb96e05e-80fb-4eec-b609-123ed43152ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.651207 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.676286 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.696832 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.713488 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-ovsdbserver-sb\") pod \"ddf10f98-4990-4b57-b586-8c47fcf5993e\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.713604 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-dns-swift-storage-0\") pod \"ddf10f98-4990-4b57-b586-8c47fcf5993e\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.713627 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-dns-svc\") pod \"ddf10f98-4990-4b57-b586-8c47fcf5993e\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.781059 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ddf10f98-4990-4b57-b586-8c47fcf5993e" (UID: "ddf10f98-4990-4b57-b586-8c47fcf5993e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.798212 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ddf10f98-4990-4b57-b586-8c47fcf5993e" (UID: "ddf10f98-4990-4b57-b586-8c47fcf5993e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.801350 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ddf10f98-4990-4b57-b586-8c47fcf5993e" (UID: "ddf10f98-4990-4b57-b586-8c47fcf5993e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.815665 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-config\") pod \"ddf10f98-4990-4b57-b586-8c47fcf5993e\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.816116 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-ovsdbserver-nb\") pod \"ddf10f98-4990-4b57-b586-8c47fcf5993e\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.816267 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdh4g\" (UniqueName: \"kubernetes.io/projected/ddf10f98-4990-4b57-b586-8c47fcf5993e-kube-api-access-tdh4g\") pod \"ddf10f98-4990-4b57-b586-8c47fcf5993e\" (UID: \"ddf10f98-4990-4b57-b586-8c47fcf5993e\") " Oct 09 10:45:44 crc 
kubenswrapper[4740]: I1009 10:45:44.817867 4740 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.817959 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.818027 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.821919 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddf10f98-4990-4b57-b586-8c47fcf5993e-kube-api-access-tdh4g" (OuterVolumeSpecName: "kube-api-access-tdh4g") pod "ddf10f98-4990-4b57-b586-8c47fcf5993e" (UID: "ddf10f98-4990-4b57-b586-8c47fcf5993e"). InnerVolumeSpecName "kube-api-access-tdh4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.869887 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-config" (OuterVolumeSpecName: "config") pod "ddf10f98-4990-4b57-b586-8c47fcf5993e" (UID: "ddf10f98-4990-4b57-b586-8c47fcf5993e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.872735 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-f8w6m" event={"ID":"fb96e05e-80fb-4eec-b609-123ed43152ae","Type":"ContainerDied","Data":"a0e92c541c9f5d8bd40bdc61f0cf213dccde0034b31baed8a1e9cc213ebd1eaf"} Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.872783 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-f8w6m" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.872801 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0e92c541c9f5d8bd40bdc61f0cf213dccde0034b31baed8a1e9cc213ebd1eaf" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.877998 4740 generic.go:334] "Generic (PLEG): container finished" podID="ddf10f98-4990-4b57-b586-8c47fcf5993e" containerID="94da4d88ef1c2e3c92a065fcc00c74247fe46254025bff6b7ba065c480db2b77" exitCode=0 Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.878098 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7d7f9c54-kwrl7" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.878343 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" event={"ID":"ddf10f98-4990-4b57-b586-8c47fcf5993e","Type":"ContainerDied","Data":"94da4d88ef1c2e3c92a065fcc00c74247fe46254025bff6b7ba065c480db2b77"} Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.878382 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" event={"ID":"ddf10f98-4990-4b57-b586-8c47fcf5993e","Type":"ContainerDied","Data":"a8ba68c58dc3c05c9f5f2fef26fa36041eebb3a99df864396afb493b58878ff8"} Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.878402 4740 scope.go:117] "RemoveContainer" containerID="94da4d88ef1c2e3c92a065fcc00c74247fe46254025bff6b7ba065c480db2b77" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.878513 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-cxphd" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.878792 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e83894ef-6c3b-4edd-ad72-e62cce53c34b" containerName="cinder-scheduler" containerID="cri-o://4faf68671e0fb6b84725f8fdfbfc1be1b2bcaa673bfdad351bc204dac076c3f7" gracePeriod=30 Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.878882 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e83894ef-6c3b-4edd-ad72-e62cce53c34b" containerName="probe" containerID="cri-o://3a83d07e53ba32ae07fd483ed99f1a3e55d0fa350dbdefe2c8dd54b8e8aa42e2" gracePeriod=30 Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.899617 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-ovsdbserver-nb" (OuterVolumeSpecName: 
"ovsdbserver-nb") pod "ddf10f98-4990-4b57-b586-8c47fcf5993e" (UID: "ddf10f98-4990-4b57-b586-8c47fcf5993e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.907651 4740 scope.go:117] "RemoveContainer" containerID="006f82e0c33141764dac4804b55006cd6eb9b7659dadac797f5366984ea94a3d" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.929139 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.929249 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddf10f98-4990-4b57-b586-8c47fcf5993e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.929264 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdh4g\" (UniqueName: \"kubernetes.io/projected/ddf10f98-4990-4b57-b586-8c47fcf5993e-kube-api-access-tdh4g\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.935147 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7d7f9c54-kwrl7"] Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.945274 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7d7f9c54-kwrl7"] Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.963441 4740 scope.go:117] "RemoveContainer" containerID="94da4d88ef1c2e3c92a065fcc00c74247fe46254025bff6b7ba065c480db2b77" Oct 09 10:45:44 crc kubenswrapper[4740]: E1009 10:45:44.964040 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94da4d88ef1c2e3c92a065fcc00c74247fe46254025bff6b7ba065c480db2b77\": container with ID starting with 
94da4d88ef1c2e3c92a065fcc00c74247fe46254025bff6b7ba065c480db2b77 not found: ID does not exist" containerID="94da4d88ef1c2e3c92a065fcc00c74247fe46254025bff6b7ba065c480db2b77" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.964091 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94da4d88ef1c2e3c92a065fcc00c74247fe46254025bff6b7ba065c480db2b77"} err="failed to get container status \"94da4d88ef1c2e3c92a065fcc00c74247fe46254025bff6b7ba065c480db2b77\": rpc error: code = NotFound desc = could not find container \"94da4d88ef1c2e3c92a065fcc00c74247fe46254025bff6b7ba065c480db2b77\": container with ID starting with 94da4d88ef1c2e3c92a065fcc00c74247fe46254025bff6b7ba065c480db2b77 not found: ID does not exist" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.964124 4740 scope.go:117] "RemoveContainer" containerID="006f82e0c33141764dac4804b55006cd6eb9b7659dadac797f5366984ea94a3d" Oct 09 10:45:44 crc kubenswrapper[4740]: E1009 10:45:44.964567 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"006f82e0c33141764dac4804b55006cd6eb9b7659dadac797f5366984ea94a3d\": container with ID starting with 006f82e0c33141764dac4804b55006cd6eb9b7659dadac797f5366984ea94a3d not found: ID does not exist" containerID="006f82e0c33141764dac4804b55006cd6eb9b7659dadac797f5366984ea94a3d" Oct 09 10:45:44 crc kubenswrapper[4740]: I1009 10:45:44.964598 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006f82e0c33141764dac4804b55006cd6eb9b7659dadac797f5366984ea94a3d"} err="failed to get container status \"006f82e0c33141764dac4804b55006cd6eb9b7659dadac797f5366984ea94a3d\": rpc error: code = NotFound desc = could not find container \"006f82e0c33141764dac4804b55006cd6eb9b7659dadac797f5366984ea94a3d\": container with ID starting with 006f82e0c33141764dac4804b55006cd6eb9b7659dadac797f5366984ea94a3d not found: ID does not 
exist" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.069474 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-8wzch"] Oct 09 10:45:45 crc kubenswrapper[4740]: E1009 10:45:45.070009 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddf10f98-4990-4b57-b586-8c47fcf5993e" containerName="init" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.070028 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf10f98-4990-4b57-b586-8c47fcf5993e" containerName="init" Oct 09 10:45:45 crc kubenswrapper[4740]: E1009 10:45:45.070072 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a64be3-4f9f-4ef1-830d-fb0d8919e1bc" containerName="barbican-api" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.070083 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a64be3-4f9f-4ef1-830d-fb0d8919e1bc" containerName="barbican-api" Oct 09 10:45:45 crc kubenswrapper[4740]: E1009 10:45:45.070100 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddf10f98-4990-4b57-b586-8c47fcf5993e" containerName="dnsmasq-dns" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.070108 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf10f98-4990-4b57-b586-8c47fcf5993e" containerName="dnsmasq-dns" Oct 09 10:45:45 crc kubenswrapper[4740]: E1009 10:45:45.070126 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb96e05e-80fb-4eec-b609-123ed43152ae" containerName="neutron-db-sync" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.070134 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb96e05e-80fb-4eec-b609-123ed43152ae" containerName="neutron-db-sync" Oct 09 10:45:45 crc kubenswrapper[4740]: E1009 10:45:45.070146 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a64be3-4f9f-4ef1-830d-fb0d8919e1bc" containerName="barbican-api-log" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.070153 4740 
state_mem.go:107] "Deleted CPUSet assignment" podUID="28a64be3-4f9f-4ef1-830d-fb0d8919e1bc" containerName="barbican-api-log" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.070346 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="28a64be3-4f9f-4ef1-830d-fb0d8919e1bc" containerName="barbican-api" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.070381 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="28a64be3-4f9f-4ef1-830d-fb0d8919e1bc" containerName="barbican-api-log" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.070396 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddf10f98-4990-4b57-b586-8c47fcf5993e" containerName="dnsmasq-dns" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.070420 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb96e05e-80fb-4eec-b609-123ed43152ae" containerName="neutron-db-sync" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.071328 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.090435 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-8wzch"] Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.132231 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-8wzch\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.132312 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-dns-svc\") pod \"dnsmasq-dns-5784cf869f-8wzch\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.132387 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpszj\" (UniqueName: \"kubernetes.io/projected/6c511056-3872-4b33-92b4-51ff3fa0d287-kube-api-access-cpszj\") pod \"dnsmasq-dns-5784cf869f-8wzch\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.132430 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-config\") pod \"dnsmasq-dns-5784cf869f-8wzch\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.132492 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-8wzch\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.132521 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-8wzch\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.201828 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5f54d79f5b-tsjf9"] Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.222461 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5f54d79f5b-tsjf9"] Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.222563 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5f54d79f5b-tsjf9" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.225106 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.225283 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-shv4j" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.225536 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.225682 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.235010 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-combined-ca-bundle\") pod \"neutron-5f54d79f5b-tsjf9\" (UID: \"3a555e46-61f3-4a67-9d18-9acf13b859f7\") " pod="openstack/neutron-5f54d79f5b-tsjf9" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.235070 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-config\") pod \"neutron-5f54d79f5b-tsjf9\" (UID: \"3a555e46-61f3-4a67-9d18-9acf13b859f7\") " pod="openstack/neutron-5f54d79f5b-tsjf9" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.235102 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-8wzch\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.235124 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-httpd-config\") pod \"neutron-5f54d79f5b-tsjf9\" (UID: \"3a555e46-61f3-4a67-9d18-9acf13b859f7\") " pod="openstack/neutron-5f54d79f5b-tsjf9" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.235142 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-8wzch\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.235187 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-8wzch\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.235218 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-dns-svc\") pod \"dnsmasq-dns-5784cf869f-8wzch\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.235249 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-ovndb-tls-certs\") pod \"neutron-5f54d79f5b-tsjf9\" (UID: \"3a555e46-61f3-4a67-9d18-9acf13b859f7\") " pod="openstack/neutron-5f54d79f5b-tsjf9" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.235286 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-cpszj\" (UniqueName: \"kubernetes.io/projected/6c511056-3872-4b33-92b4-51ff3fa0d287-kube-api-access-cpszj\") pod \"dnsmasq-dns-5784cf869f-8wzch\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.235313 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-config\") pod \"dnsmasq-dns-5784cf869f-8wzch\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.235332 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csnj2\" (UniqueName: \"kubernetes.io/projected/3a555e46-61f3-4a67-9d18-9acf13b859f7-kube-api-access-csnj2\") pod \"neutron-5f54d79f5b-tsjf9\" (UID: \"3a555e46-61f3-4a67-9d18-9acf13b859f7\") " pod="openstack/neutron-5f54d79f5b-tsjf9" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.236674 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-8wzch\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.236719 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-dns-svc\") pod \"dnsmasq-dns-5784cf869f-8wzch\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.237533 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-config\") pod \"dnsmasq-dns-5784cf869f-8wzch\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.237592 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-cxphd"] Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.239274 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-8wzch\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.239427 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-8wzch\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.245675 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-cxphd"] Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.255430 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpszj\" (UniqueName: \"kubernetes.io/projected/6c511056-3872-4b33-92b4-51ff3fa0d287-kube-api-access-cpszj\") pod \"dnsmasq-dns-5784cf869f-8wzch\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.336692 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-ovndb-tls-certs\") pod \"neutron-5f54d79f5b-tsjf9\" (UID: 
\"3a555e46-61f3-4a67-9d18-9acf13b859f7\") " pod="openstack/neutron-5f54d79f5b-tsjf9" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.337108 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csnj2\" (UniqueName: \"kubernetes.io/projected/3a555e46-61f3-4a67-9d18-9acf13b859f7-kube-api-access-csnj2\") pod \"neutron-5f54d79f5b-tsjf9\" (UID: \"3a555e46-61f3-4a67-9d18-9acf13b859f7\") " pod="openstack/neutron-5f54d79f5b-tsjf9" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.337144 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-combined-ca-bundle\") pod \"neutron-5f54d79f5b-tsjf9\" (UID: \"3a555e46-61f3-4a67-9d18-9acf13b859f7\") " pod="openstack/neutron-5f54d79f5b-tsjf9" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.337177 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-config\") pod \"neutron-5f54d79f5b-tsjf9\" (UID: \"3a555e46-61f3-4a67-9d18-9acf13b859f7\") " pod="openstack/neutron-5f54d79f5b-tsjf9" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.337215 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-httpd-config\") pod \"neutron-5f54d79f5b-tsjf9\" (UID: \"3a555e46-61f3-4a67-9d18-9acf13b859f7\") " pod="openstack/neutron-5f54d79f5b-tsjf9" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.341083 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-ovndb-tls-certs\") pod \"neutron-5f54d79f5b-tsjf9\" (UID: \"3a555e46-61f3-4a67-9d18-9acf13b859f7\") " pod="openstack/neutron-5f54d79f5b-tsjf9" Oct 09 10:45:45 crc 
kubenswrapper[4740]: I1009 10:45:45.344572 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-config\") pod \"neutron-5f54d79f5b-tsjf9\" (UID: \"3a555e46-61f3-4a67-9d18-9acf13b859f7\") " pod="openstack/neutron-5f54d79f5b-tsjf9" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.345318 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-combined-ca-bundle\") pod \"neutron-5f54d79f5b-tsjf9\" (UID: \"3a555e46-61f3-4a67-9d18-9acf13b859f7\") " pod="openstack/neutron-5f54d79f5b-tsjf9" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.346419 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-httpd-config\") pod \"neutron-5f54d79f5b-tsjf9\" (UID: \"3a555e46-61f3-4a67-9d18-9acf13b859f7\") " pod="openstack/neutron-5f54d79f5b-tsjf9" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.363781 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csnj2\" (UniqueName: \"kubernetes.io/projected/3a555e46-61f3-4a67-9d18-9acf13b859f7-kube-api-access-csnj2\") pod \"neutron-5f54d79f5b-tsjf9\" (UID: \"3a555e46-61f3-4a67-9d18-9acf13b859f7\") " pod="openstack/neutron-5f54d79f5b-tsjf9" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.394436 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.540189 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5f54d79f5b-tsjf9" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.766183 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28a64be3-4f9f-4ef1-830d-fb0d8919e1bc" path="/var/lib/kubelet/pods/28a64be3-4f9f-4ef1-830d-fb0d8919e1bc/volumes" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.767219 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddf10f98-4990-4b57-b586-8c47fcf5993e" path="/var/lib/kubelet/pods/ddf10f98-4990-4b57-b586-8c47fcf5993e/volumes" Oct 09 10:45:45 crc kubenswrapper[4740]: I1009 10:45:45.946241 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-8wzch"] Oct 09 10:45:46 crc kubenswrapper[4740]: I1009 10:45:46.067723 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:46 crc kubenswrapper[4740]: W1009 10:45:46.162431 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a555e46_61f3_4a67_9d18_9acf13b859f7.slice/crio-540440658e413ab2a730b8b0ac3726c8100b2b239c2af74077fdaa83ac5d389b WatchSource:0}: Error finding container 540440658e413ab2a730b8b0ac3726c8100b2b239c2af74077fdaa83ac5d389b: Status 404 returned error can't find the container with id 540440658e413ab2a730b8b0ac3726c8100b2b239c2af74077fdaa83ac5d389b Oct 09 10:45:46 crc kubenswrapper[4740]: I1009 10:45:46.162744 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55c77867db-hsc8q" Oct 09 10:45:46 crc kubenswrapper[4740]: I1009 10:45:46.164816 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5f54d79f5b-tsjf9"] Oct 09 10:45:46 crc kubenswrapper[4740]: I1009 10:45:46.896254 4740 generic.go:334] "Generic (PLEG): container finished" podID="6c511056-3872-4b33-92b4-51ff3fa0d287" 
containerID="833b938ba95c6ead698b57005a7c07cca71da6bade145296713a10e9dc89b413" exitCode=0 Oct 09 10:45:46 crc kubenswrapper[4740]: I1009 10:45:46.896353 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-8wzch" event={"ID":"6c511056-3872-4b33-92b4-51ff3fa0d287","Type":"ContainerDied","Data":"833b938ba95c6ead698b57005a7c07cca71da6bade145296713a10e9dc89b413"} Oct 09 10:45:46 crc kubenswrapper[4740]: I1009 10:45:46.896623 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-8wzch" event={"ID":"6c511056-3872-4b33-92b4-51ff3fa0d287","Type":"ContainerStarted","Data":"482603ad9e072e7942b6ca67d60bbe30ad1d2f7b4096b8cd98fd9960e33eedee"} Oct 09 10:45:46 crc kubenswrapper[4740]: I1009 10:45:46.898568 4740 generic.go:334] "Generic (PLEG): container finished" podID="e83894ef-6c3b-4edd-ad72-e62cce53c34b" containerID="3a83d07e53ba32ae07fd483ed99f1a3e55d0fa350dbdefe2c8dd54b8e8aa42e2" exitCode=0 Oct 09 10:45:46 crc kubenswrapper[4740]: I1009 10:45:46.898914 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e83894ef-6c3b-4edd-ad72-e62cce53c34b","Type":"ContainerDied","Data":"3a83d07e53ba32ae07fd483ed99f1a3e55d0fa350dbdefe2c8dd54b8e8aa42e2"} Oct 09 10:45:46 crc kubenswrapper[4740]: I1009 10:45:46.901891 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f54d79f5b-tsjf9" event={"ID":"3a555e46-61f3-4a67-9d18-9acf13b859f7","Type":"ContainerStarted","Data":"101e0a0b0c0e372ef0da471fe6744819ea5832156da9489ca62d0b38ba0b893a"} Oct 09 10:45:46 crc kubenswrapper[4740]: I1009 10:45:46.901935 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f54d79f5b-tsjf9" event={"ID":"3a555e46-61f3-4a67-9d18-9acf13b859f7","Type":"ContainerStarted","Data":"e3247a241eda0c21d93b7a8f1fdd98f5d0ab875cb2dec08ca673ec9a6f900664"} Oct 09 10:45:46 crc kubenswrapper[4740]: I1009 10:45:46.901950 4740 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/neutron-5f54d79f5b-tsjf9" event={"ID":"3a555e46-61f3-4a67-9d18-9acf13b859f7","Type":"ContainerStarted","Data":"540440658e413ab2a730b8b0ac3726c8100b2b239c2af74077fdaa83ac5d389b"} Oct 09 10:45:46 crc kubenswrapper[4740]: I1009 10:45:46.902134 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5f54d79f5b-tsjf9" Oct 09 10:45:46 crc kubenswrapper[4740]: I1009 10:45:46.953075 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5f54d79f5b-tsjf9" podStartSLOduration=1.9530525810000001 podStartE2EDuration="1.953052581s" podCreationTimestamp="2025-10-09 10:45:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:45:46.942209388 +0000 UTC m=+1085.904409769" watchObservedRunningTime="2025-10-09 10:45:46.953052581 +0000 UTC m=+1085.915252952" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.478458 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-d74f6589-zvlln"] Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.480498 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.482523 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.491313 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d74f6589-zvlln"] Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.492818 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.586224 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19-ovndb-tls-certs\") pod \"neutron-d74f6589-zvlln\" (UID: \"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19\") " pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.586311 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19-config\") pod \"neutron-d74f6589-zvlln\" (UID: \"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19\") " pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.586415 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19-httpd-config\") pod \"neutron-d74f6589-zvlln\" (UID: \"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19\") " pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.586505 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19-combined-ca-bundle\") pod \"neutron-d74f6589-zvlln\" (UID: \"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19\") " pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.586673 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19-internal-tls-certs\") pod \"neutron-d74f6589-zvlln\" (UID: \"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19\") " pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.586915 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19-public-tls-certs\") pod \"neutron-d74f6589-zvlln\" (UID: \"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19\") " pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.586952 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqbth\" (UniqueName: \"kubernetes.io/projected/94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19-kube-api-access-sqbth\") pod \"neutron-d74f6589-zvlln\" (UID: \"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19\") " pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.688146 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19-internal-tls-certs\") pod \"neutron-d74f6589-zvlln\" (UID: \"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19\") " pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.688246 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19-public-tls-certs\") pod \"neutron-d74f6589-zvlln\" (UID: \"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19\") " pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.688268 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqbth\" (UniqueName: \"kubernetes.io/projected/94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19-kube-api-access-sqbth\") pod \"neutron-d74f6589-zvlln\" (UID: \"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19\") " pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.688301 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19-ovndb-tls-certs\") pod \"neutron-d74f6589-zvlln\" (UID: \"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19\") " pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.688355 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19-config\") pod \"neutron-d74f6589-zvlln\" (UID: \"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19\") " pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.688373 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19-httpd-config\") pod \"neutron-d74f6589-zvlln\" (UID: \"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19\") " pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.688396 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19-combined-ca-bundle\") pod \"neutron-d74f6589-zvlln\" (UID: 
\"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19\") " pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.693934 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19-combined-ca-bundle\") pod \"neutron-d74f6589-zvlln\" (UID: \"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19\") " pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.694034 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19-httpd-config\") pod \"neutron-d74f6589-zvlln\" (UID: \"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19\") " pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.694654 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19-internal-tls-certs\") pod \"neutron-d74f6589-zvlln\" (UID: \"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19\") " pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.695499 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19-config\") pod \"neutron-d74f6589-zvlln\" (UID: \"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19\") " pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.696712 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19-ovndb-tls-certs\") pod \"neutron-d74f6589-zvlln\" (UID: \"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19\") " pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.701072 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19-public-tls-certs\") pod \"neutron-d74f6589-zvlln\" (UID: \"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19\") " pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.717835 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqbth\" (UniqueName: \"kubernetes.io/projected/94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19-kube-api-access-sqbth\") pod \"neutron-d74f6589-zvlln\" (UID: \"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19\") " pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.796337 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.925950 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-8wzch" event={"ID":"6c511056-3872-4b33-92b4-51ff3fa0d287","Type":"ContainerStarted","Data":"7ca0132ed94fddc367215f2a02e9ca5e33317f527172ccaf94b42c83456bb904"} Oct 09 10:45:47 crc kubenswrapper[4740]: I1009 10:45:47.926318 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:48 crc kubenswrapper[4740]: I1009 10:45:48.378629 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-8wzch" podStartSLOduration=3.378613836 podStartE2EDuration="3.378613836s" podCreationTimestamp="2025-10-09 10:45:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:45:47.956196194 +0000 UTC m=+1086.918396595" watchObservedRunningTime="2025-10-09 10:45:48.378613836 +0000 UTC m=+1087.340814217" Oct 09 10:45:48 crc kubenswrapper[4740]: I1009 10:45:48.383234 4740 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d74f6589-zvlln"] Oct 09 10:45:48 crc kubenswrapper[4740]: I1009 10:45:48.945951 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d74f6589-zvlln" event={"ID":"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19","Type":"ContainerStarted","Data":"2414ca1133a0f95252c86968933da548758fd68b7da1a2b7a610d740df5ffe3c"} Oct 09 10:45:48 crc kubenswrapper[4740]: I1009 10:45:48.946232 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d74f6589-zvlln" event={"ID":"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19","Type":"ContainerStarted","Data":"f13c78285b12ab732182c6301adeeb7af22b3371a7ae95600c02aff20399e26f"} Oct 09 10:45:48 crc kubenswrapper[4740]: I1009 10:45:48.946243 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d74f6589-zvlln" event={"ID":"94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19","Type":"ContainerStarted","Data":"fcab14202e7b34bb8691d8174d11f4493a47eb4897d06f5255cc4dd77285654f"} Oct 09 10:45:48 crc kubenswrapper[4740]: I1009 10:45:48.946737 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:45:48 crc kubenswrapper[4740]: I1009 10:45:48.981340 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-d74f6589-zvlln" podStartSLOduration=1.981318581 podStartE2EDuration="1.981318581s" podCreationTimestamp="2025-10-09 10:45:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:45:48.971229288 +0000 UTC m=+1087.933429699" watchObservedRunningTime="2025-10-09 10:45:48.981318581 +0000 UTC m=+1087.943518972" Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.407362 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.521578 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-scripts\") pod \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.521666 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-combined-ca-bundle\") pod \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.521728 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-config-data\") pod \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.521745 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-config-data-custom\") pod \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.521808 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7rnn\" (UniqueName: \"kubernetes.io/projected/e83894ef-6c3b-4edd-ad72-e62cce53c34b-kube-api-access-z7rnn\") pod \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.521837 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/e83894ef-6c3b-4edd-ad72-e62cce53c34b-etc-machine-id\") pod \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\" (UID: \"e83894ef-6c3b-4edd-ad72-e62cce53c34b\") " Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.522025 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e83894ef-6c3b-4edd-ad72-e62cce53c34b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e83894ef-6c3b-4edd-ad72-e62cce53c34b" (UID: "e83894ef-6c3b-4edd-ad72-e62cce53c34b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.522743 4740 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e83894ef-6c3b-4edd-ad72-e62cce53c34b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.527961 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e83894ef-6c3b-4edd-ad72-e62cce53c34b-kube-api-access-z7rnn" (OuterVolumeSpecName: "kube-api-access-z7rnn") pod "e83894ef-6c3b-4edd-ad72-e62cce53c34b" (UID: "e83894ef-6c3b-4edd-ad72-e62cce53c34b"). InnerVolumeSpecName "kube-api-access-z7rnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.528851 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-scripts" (OuterVolumeSpecName: "scripts") pod "e83894ef-6c3b-4edd-ad72-e62cce53c34b" (UID: "e83894ef-6c3b-4edd-ad72-e62cce53c34b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.529226 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e83894ef-6c3b-4edd-ad72-e62cce53c34b" (UID: "e83894ef-6c3b-4edd-ad72-e62cce53c34b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.590583 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e83894ef-6c3b-4edd-ad72-e62cce53c34b" (UID: "e83894ef-6c3b-4edd-ad72-e62cce53c34b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.624303 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.624448 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.624461 4740 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.624470 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7rnn\" (UniqueName: \"kubernetes.io/projected/e83894ef-6c3b-4edd-ad72-e62cce53c34b-kube-api-access-z7rnn\") on 
node \"crc\" DevicePath \"\"" Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.650940 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-config-data" (OuterVolumeSpecName: "config-data") pod "e83894ef-6c3b-4edd-ad72-e62cce53c34b" (UID: "e83894ef-6c3b-4edd-ad72-e62cce53c34b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.725630 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83894ef-6c3b-4edd-ad72-e62cce53c34b-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.965651 4740 generic.go:334] "Generic (PLEG): container finished" podID="e83894ef-6c3b-4edd-ad72-e62cce53c34b" containerID="4faf68671e0fb6b84725f8fdfbfc1be1b2bcaa673bfdad351bc204dac076c3f7" exitCode=0 Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.966024 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e83894ef-6c3b-4edd-ad72-e62cce53c34b","Type":"ContainerDied","Data":"4faf68671e0fb6b84725f8fdfbfc1be1b2bcaa673bfdad351bc204dac076c3f7"} Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.966148 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e83894ef-6c3b-4edd-ad72-e62cce53c34b","Type":"ContainerDied","Data":"d619fddd70cb763c926823faa5273b3a84ef41753450a4c67204cd7cd492541e"} Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.966254 4740 scope.go:117] "RemoveContainer" containerID="3a83d07e53ba32ae07fd483ed99f1a3e55d0fa350dbdefe2c8dd54b8e8aa42e2" Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.966386 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 10:45:49 crc kubenswrapper[4740]: I1009 10:45:49.999022 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.009953 4740 scope.go:117] "RemoveContainer" containerID="4faf68671e0fb6b84725f8fdfbfc1be1b2bcaa673bfdad351bc204dac076c3f7" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.026968 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.042345 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 10:45:50 crc kubenswrapper[4740]: E1009 10:45:50.042971 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e83894ef-6c3b-4edd-ad72-e62cce53c34b" containerName="cinder-scheduler" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.043003 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e83894ef-6c3b-4edd-ad72-e62cce53c34b" containerName="cinder-scheduler" Oct 09 10:45:50 crc kubenswrapper[4740]: E1009 10:45:50.043055 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e83894ef-6c3b-4edd-ad72-e62cce53c34b" containerName="probe" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.043067 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e83894ef-6c3b-4edd-ad72-e62cce53c34b" containerName="probe" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.050469 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="e83894ef-6c3b-4edd-ad72-e62cce53c34b" containerName="probe" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.050601 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="e83894ef-6c3b-4edd-ad72-e62cce53c34b" containerName="cinder-scheduler" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.054930 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.064445 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.065720 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.082669 4740 scope.go:117] "RemoveContainer" containerID="3a83d07e53ba32ae07fd483ed99f1a3e55d0fa350dbdefe2c8dd54b8e8aa42e2" Oct 09 10:45:50 crc kubenswrapper[4740]: E1009 10:45:50.083149 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a83d07e53ba32ae07fd483ed99f1a3e55d0fa350dbdefe2c8dd54b8e8aa42e2\": container with ID starting with 3a83d07e53ba32ae07fd483ed99f1a3e55d0fa350dbdefe2c8dd54b8e8aa42e2 not found: ID does not exist" containerID="3a83d07e53ba32ae07fd483ed99f1a3e55d0fa350dbdefe2c8dd54b8e8aa42e2" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.083175 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a83d07e53ba32ae07fd483ed99f1a3e55d0fa350dbdefe2c8dd54b8e8aa42e2"} err="failed to get container status \"3a83d07e53ba32ae07fd483ed99f1a3e55d0fa350dbdefe2c8dd54b8e8aa42e2\": rpc error: code = NotFound desc = could not find container \"3a83d07e53ba32ae07fd483ed99f1a3e55d0fa350dbdefe2c8dd54b8e8aa42e2\": container with ID starting with 3a83d07e53ba32ae07fd483ed99f1a3e55d0fa350dbdefe2c8dd54b8e8aa42e2 not found: ID does not exist" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.083195 4740 scope.go:117] "RemoveContainer" containerID="4faf68671e0fb6b84725f8fdfbfc1be1b2bcaa673bfdad351bc204dac076c3f7" Oct 09 10:45:50 crc kubenswrapper[4740]: E1009 10:45:50.083366 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4faf68671e0fb6b84725f8fdfbfc1be1b2bcaa673bfdad351bc204dac076c3f7\": container with ID starting with 4faf68671e0fb6b84725f8fdfbfc1be1b2bcaa673bfdad351bc204dac076c3f7 not found: ID does not exist" containerID="4faf68671e0fb6b84725f8fdfbfc1be1b2bcaa673bfdad351bc204dac076c3f7" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.083382 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4faf68671e0fb6b84725f8fdfbfc1be1b2bcaa673bfdad351bc204dac076c3f7"} err="failed to get container status \"4faf68671e0fb6b84725f8fdfbfc1be1b2bcaa673bfdad351bc204dac076c3f7\": rpc error: code = NotFound desc = could not find container \"4faf68671e0fb6b84725f8fdfbfc1be1b2bcaa673bfdad351bc204dac076c3f7\": container with ID starting with 4faf68671e0fb6b84725f8fdfbfc1be1b2bcaa673bfdad351bc204dac076c3f7 not found: ID does not exist" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.133578 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c-scripts\") pod \"cinder-scheduler-0\" (UID: \"f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.133943 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.134008 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c\") " 
pod="openstack/cinder-scheduler-0" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.134073 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.134238 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjq26\" (UniqueName: \"kubernetes.io/projected/f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c-kube-api-access-zjq26\") pod \"cinder-scheduler-0\" (UID: \"f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.134301 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c-config-data\") pod \"cinder-scheduler-0\" (UID: \"f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.236229 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjq26\" (UniqueName: \"kubernetes.io/projected/f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c-kube-api-access-zjq26\") pod \"cinder-scheduler-0\" (UID: \"f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.236574 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c-config-data\") pod \"cinder-scheduler-0\" (UID: \"f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 
10:45:50.237306 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c-scripts\") pod \"cinder-scheduler-0\" (UID: \"f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.237346 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.237375 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.237413 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.237592 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.245362 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.246156 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c-config-data\") pod \"cinder-scheduler-0\" (UID: \"f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.246893 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.255253 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c-scripts\") pod \"cinder-scheduler-0\" (UID: \"f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.260449 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjq26\" (UniqueName: \"kubernetes.io/projected/f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c-kube-api-access-zjq26\") pod \"cinder-scheduler-0\" (UID: \"f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c\") " pod="openstack/cinder-scheduler-0" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.387701 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.838436 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 10:45:50 crc kubenswrapper[4740]: W1009 10:45:50.839809 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf22418d4_a6c6_4e33_a96e_3e8c2d4a5e1c.slice/crio-ac49024dbdb3069fa200cebc294393c1c9424beab4ce4f902b376726d3259bb2 WatchSource:0}: Error finding container ac49024dbdb3069fa200cebc294393c1c9424beab4ce4f902b376726d3259bb2: Status 404 returned error can't find the container with id ac49024dbdb3069fa200cebc294393c1c9424beab4ce4f902b376726d3259bb2 Oct 09 10:45:50 crc kubenswrapper[4740]: I1009 10:45:50.976305 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c","Type":"ContainerStarted","Data":"ac49024dbdb3069fa200cebc294393c1c9424beab4ce4f902b376726d3259bb2"} Oct 09 10:45:51 crc kubenswrapper[4740]: I1009 10:45:51.210292 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 09 10:45:51 crc kubenswrapper[4740]: I1009 10:45:51.780563 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e83894ef-6c3b-4edd-ad72-e62cce53c34b" path="/var/lib/kubelet/pods/e83894ef-6c3b-4edd-ad72-e62cce53c34b/volumes" Oct 09 10:45:51 crc kubenswrapper[4740]: I1009 10:45:51.990645 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c","Type":"ContainerStarted","Data":"cc024f4ea138d98b6677f610137f547a5d565d2807e18c7d4eaa9eadb15d3fd2"} Oct 09 10:45:52 crc kubenswrapper[4740]: I1009 10:45:52.163449 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5db569f5cf-ksc2p" Oct 09 10:45:52 crc 
kubenswrapper[4740]: I1009 10:45:52.392287 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 09 10:45:52 crc kubenswrapper[4740]: I1009 10:45:52.393851 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 09 10:45:52 crc kubenswrapper[4740]: I1009 10:45:52.401063 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-4nvd9" Oct 09 10:45:52 crc kubenswrapper[4740]: I1009 10:45:52.401680 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 09 10:45:52 crc kubenswrapper[4740]: I1009 10:45:52.413160 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 09 10:45:52 crc kubenswrapper[4740]: I1009 10:45:52.416382 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 09 10:45:52 crc kubenswrapper[4740]: I1009 10:45:52.493047 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa49127-ef96-4ed6-8b72-c106e5575707-combined-ca-bundle\") pod \"openstackclient\" (UID: \"efa49127-ef96-4ed6-8b72-c106e5575707\") " pod="openstack/openstackclient" Oct 09 10:45:52 crc kubenswrapper[4740]: I1009 10:45:52.493371 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/efa49127-ef96-4ed6-8b72-c106e5575707-openstack-config-secret\") pod \"openstackclient\" (UID: \"efa49127-ef96-4ed6-8b72-c106e5575707\") " pod="openstack/openstackclient" Oct 09 10:45:52 crc kubenswrapper[4740]: I1009 10:45:52.493500 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfqzr\" (UniqueName: 
\"kubernetes.io/projected/efa49127-ef96-4ed6-8b72-c106e5575707-kube-api-access-xfqzr\") pod \"openstackclient\" (UID: \"efa49127-ef96-4ed6-8b72-c106e5575707\") " pod="openstack/openstackclient" Oct 09 10:45:52 crc kubenswrapper[4740]: I1009 10:45:52.493646 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/efa49127-ef96-4ed6-8b72-c106e5575707-openstack-config\") pod \"openstackclient\" (UID: \"efa49127-ef96-4ed6-8b72-c106e5575707\") " pod="openstack/openstackclient" Oct 09 10:45:52 crc kubenswrapper[4740]: I1009 10:45:52.596163 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa49127-ef96-4ed6-8b72-c106e5575707-combined-ca-bundle\") pod \"openstackclient\" (UID: \"efa49127-ef96-4ed6-8b72-c106e5575707\") " pod="openstack/openstackclient" Oct 09 10:45:52 crc kubenswrapper[4740]: I1009 10:45:52.596540 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/efa49127-ef96-4ed6-8b72-c106e5575707-openstack-config-secret\") pod \"openstackclient\" (UID: \"efa49127-ef96-4ed6-8b72-c106e5575707\") " pod="openstack/openstackclient" Oct 09 10:45:52 crc kubenswrapper[4740]: I1009 10:45:52.596670 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfqzr\" (UniqueName: \"kubernetes.io/projected/efa49127-ef96-4ed6-8b72-c106e5575707-kube-api-access-xfqzr\") pod \"openstackclient\" (UID: \"efa49127-ef96-4ed6-8b72-c106e5575707\") " pod="openstack/openstackclient" Oct 09 10:45:52 crc kubenswrapper[4740]: I1009 10:45:52.596844 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/efa49127-ef96-4ed6-8b72-c106e5575707-openstack-config\") pod \"openstackclient\" (UID: 
\"efa49127-ef96-4ed6-8b72-c106e5575707\") " pod="openstack/openstackclient" Oct 09 10:45:52 crc kubenswrapper[4740]: I1009 10:45:52.598342 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/efa49127-ef96-4ed6-8b72-c106e5575707-openstack-config\") pod \"openstackclient\" (UID: \"efa49127-ef96-4ed6-8b72-c106e5575707\") " pod="openstack/openstackclient" Oct 09 10:45:52 crc kubenswrapper[4740]: I1009 10:45:52.604019 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa49127-ef96-4ed6-8b72-c106e5575707-combined-ca-bundle\") pod \"openstackclient\" (UID: \"efa49127-ef96-4ed6-8b72-c106e5575707\") " pod="openstack/openstackclient" Oct 09 10:45:52 crc kubenswrapper[4740]: I1009 10:45:52.610270 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/efa49127-ef96-4ed6-8b72-c106e5575707-openstack-config-secret\") pod \"openstackclient\" (UID: \"efa49127-ef96-4ed6-8b72-c106e5575707\") " pod="openstack/openstackclient" Oct 09 10:45:52 crc kubenswrapper[4740]: I1009 10:45:52.617401 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfqzr\" (UniqueName: \"kubernetes.io/projected/efa49127-ef96-4ed6-8b72-c106e5575707-kube-api-access-xfqzr\") pod \"openstackclient\" (UID: \"efa49127-ef96-4ed6-8b72-c106e5575707\") " pod="openstack/openstackclient" Oct 09 10:45:52 crc kubenswrapper[4740]: I1009 10:45:52.734183 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 09 10:45:53 crc kubenswrapper[4740]: I1009 10:45:53.005019 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c","Type":"ContainerStarted","Data":"a9cd643724cb51c953ea1cfc1f0040012d34f655bf5c0287a2e1b12cebc13143"} Oct 09 10:45:53 crc kubenswrapper[4740]: I1009 10:45:53.230804 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.230786227 podStartE2EDuration="4.230786227s" podCreationTimestamp="2025-10-09 10:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:45:53.030711398 +0000 UTC m=+1091.992911779" watchObservedRunningTime="2025-10-09 10:45:53.230786227 +0000 UTC m=+1092.192986618" Oct 09 10:45:53 crc kubenswrapper[4740]: I1009 10:45:53.242162 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 09 10:45:53 crc kubenswrapper[4740]: I1009 10:45:53.517183 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6f67cbf644-2n99k" podUID="5d46647a-6230-4561-bd21-a433ed55dad2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Oct 09 10:45:54 crc kubenswrapper[4740]: I1009 10:45:54.023296 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"efa49127-ef96-4ed6-8b72-c106e5575707","Type":"ContainerStarted","Data":"02c1bf5f6ed7f1e7be20cf21ee4575293f79436eb76d01d5329de0066c7b3c54"} Oct 09 10:45:55 crc kubenswrapper[4740]: I1009 10:45:55.388559 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 09 10:45:55 crc kubenswrapper[4740]: I1009 
10:45:55.395952 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:45:55 crc kubenswrapper[4740]: I1009 10:45:55.447891 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-rh279"] Oct 09 10:45:55 crc kubenswrapper[4740]: I1009 10:45:55.449798 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69c986f6d7-rh279" podUID="ee8ab1e0-ece4-4528-8609-7f56fb884ae8" containerName="dnsmasq-dns" containerID="cri-o://3add4974bea86670e4d0ea33aea09eb81e0ca9356bb1eefb14f9dfff8808001e" gracePeriod=10 Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.034461 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.043959 4740 generic.go:334] "Generic (PLEG): container finished" podID="ee8ab1e0-ece4-4528-8609-7f56fb884ae8" containerID="3add4974bea86670e4d0ea33aea09eb81e0ca9356bb1eefb14f9dfff8808001e" exitCode=0 Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.043997 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-rh279" event={"ID":"ee8ab1e0-ece4-4528-8609-7f56fb884ae8","Type":"ContainerDied","Data":"3add4974bea86670e4d0ea33aea09eb81e0ca9356bb1eefb14f9dfff8808001e"} Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.044021 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-rh279" event={"ID":"ee8ab1e0-ece4-4528-8609-7f56fb884ae8","Type":"ContainerDied","Data":"5d8ac3b1fc2ae5fed0cfa69934f96e42324b436669e5c53b52e976d9ee6006c4"} Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.044036 4740 scope.go:117] "RemoveContainer" containerID="3add4974bea86670e4d0ea33aea09eb81e0ca9356bb1eefb14f9dfff8808001e" Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.044155 4740 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-rh279" Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.072445 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-dns-swift-storage-0\") pod \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.072517 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-dns-svc\") pod \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.072570 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-ovsdbserver-nb\") pod \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.072600 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whhjf\" (UniqueName: \"kubernetes.io/projected/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-kube-api-access-whhjf\") pod \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.073526 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-ovsdbserver-sb\") pod \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.073599 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-config\") pod \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\" (UID: \"ee8ab1e0-ece4-4528-8609-7f56fb884ae8\") " Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.078414 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-kube-api-access-whhjf" (OuterVolumeSpecName: "kube-api-access-whhjf") pod "ee8ab1e0-ece4-4528-8609-7f56fb884ae8" (UID: "ee8ab1e0-ece4-4528-8609-7f56fb884ae8"). InnerVolumeSpecName "kube-api-access-whhjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.090330 4740 scope.go:117] "RemoveContainer" containerID="41c65550daf286b87ae96c7fdcbb6813950a0359ac01d7552a39d0cefb7ab0a4" Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.119318 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ee8ab1e0-ece4-4528-8609-7f56fb884ae8" (UID: "ee8ab1e0-ece4-4528-8609-7f56fb884ae8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.167915 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-config" (OuterVolumeSpecName: "config") pod "ee8ab1e0-ece4-4528-8609-7f56fb884ae8" (UID: "ee8ab1e0-ece4-4528-8609-7f56fb884ae8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.179224 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.179267 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whhjf\" (UniqueName: \"kubernetes.io/projected/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-kube-api-access-whhjf\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.179312 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.200596 4740 scope.go:117] "RemoveContainer" containerID="3add4974bea86670e4d0ea33aea09eb81e0ca9356bb1eefb14f9dfff8808001e" Oct 09 10:45:56 crc kubenswrapper[4740]: E1009 10:45:56.201117 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3add4974bea86670e4d0ea33aea09eb81e0ca9356bb1eefb14f9dfff8808001e\": container with ID starting with 3add4974bea86670e4d0ea33aea09eb81e0ca9356bb1eefb14f9dfff8808001e not found: ID does not exist" containerID="3add4974bea86670e4d0ea33aea09eb81e0ca9356bb1eefb14f9dfff8808001e" Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.201165 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3add4974bea86670e4d0ea33aea09eb81e0ca9356bb1eefb14f9dfff8808001e"} err="failed to get container status \"3add4974bea86670e4d0ea33aea09eb81e0ca9356bb1eefb14f9dfff8808001e\": rpc error: code = NotFound desc = could not find container \"3add4974bea86670e4d0ea33aea09eb81e0ca9356bb1eefb14f9dfff8808001e\": container with ID starting with 
3add4974bea86670e4d0ea33aea09eb81e0ca9356bb1eefb14f9dfff8808001e not found: ID does not exist" Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.201193 4740 scope.go:117] "RemoveContainer" containerID="41c65550daf286b87ae96c7fdcbb6813950a0359ac01d7552a39d0cefb7ab0a4" Oct 09 10:45:56 crc kubenswrapper[4740]: E1009 10:45:56.201654 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41c65550daf286b87ae96c7fdcbb6813950a0359ac01d7552a39d0cefb7ab0a4\": container with ID starting with 41c65550daf286b87ae96c7fdcbb6813950a0359ac01d7552a39d0cefb7ab0a4 not found: ID does not exist" containerID="41c65550daf286b87ae96c7fdcbb6813950a0359ac01d7552a39d0cefb7ab0a4" Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.201701 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41c65550daf286b87ae96c7fdcbb6813950a0359ac01d7552a39d0cefb7ab0a4"} err="failed to get container status \"41c65550daf286b87ae96c7fdcbb6813950a0359ac01d7552a39d0cefb7ab0a4\": rpc error: code = NotFound desc = could not find container \"41c65550daf286b87ae96c7fdcbb6813950a0359ac01d7552a39d0cefb7ab0a4\": container with ID starting with 41c65550daf286b87ae96c7fdcbb6813950a0359ac01d7552a39d0cefb7ab0a4 not found: ID does not exist" Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.216615 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ee8ab1e0-ece4-4528-8609-7f56fb884ae8" (UID: "ee8ab1e0-ece4-4528-8609-7f56fb884ae8"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.222638 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ee8ab1e0-ece4-4528-8609-7f56fb884ae8" (UID: "ee8ab1e0-ece4-4528-8609-7f56fb884ae8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.223682 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee8ab1e0-ece4-4528-8609-7f56fb884ae8" (UID: "ee8ab1e0-ece4-4528-8609-7f56fb884ae8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.284318 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.284353 4740 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.284368 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee8ab1e0-ece4-4528-8609-7f56fb884ae8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.403418 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-rh279"] Oct 09 10:45:56 crc kubenswrapper[4740]: I1009 10:45:56.410666 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-69c986f6d7-rh279"] Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.346095 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.347165 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerName="ceilometer-central-agent" containerID="cri-o://8fad223b9b5380edb8648535ccd0110e972a07409b1f00e7005dcda2620179b8" gracePeriod=30 Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.347729 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerName="proxy-httpd" containerID="cri-o://dc54c526fa37681bf9b2356a7bd9ecff678f102e8f84d10425927807b3ae3379" gracePeriod=30 Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.347828 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerName="sg-core" containerID="cri-o://02d2855788adc8a6b430634bc0dc2ad436c644505ffb8ad4dbfc22e0c9ba8b19" gracePeriod=30 Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.347915 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerName="ceilometer-notification-agent" containerID="cri-o://1234ce3da3be36738b380320cb596846d7cde22c4335b42f547bae728cd1b611" gracePeriod=30 Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.354793 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.164:3000/\": EOF" Oct 09 10:45:57 crc kubenswrapper[4740]: E1009 10:45:57.510522 4740 cadvisor_stats_provider.go:516] "Partial 
failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeeb3ea7a_c4b5_4f0d_b4e6_31c0699bd1b3.slice/crio-dc54c526fa37681bf9b2356a7bd9ecff678f102e8f84d10425927807b3ae3379.scope\": RecentStats: unable to find data in memory cache]" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.743194 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-66569d88ff-tjljh"] Oct 09 10:45:57 crc kubenswrapper[4740]: E1009 10:45:57.744235 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee8ab1e0-ece4-4528-8609-7f56fb884ae8" containerName="init" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.744260 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee8ab1e0-ece4-4528-8609-7f56fb884ae8" containerName="init" Oct 09 10:45:57 crc kubenswrapper[4740]: E1009 10:45:57.744298 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee8ab1e0-ece4-4528-8609-7f56fb884ae8" containerName="dnsmasq-dns" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.744308 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee8ab1e0-ece4-4528-8609-7f56fb884ae8" containerName="dnsmasq-dns" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.744519 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee8ab1e0-ece4-4528-8609-7f56fb884ae8" containerName="dnsmasq-dns" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.745688 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.748388 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.748479 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.748925 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.769003 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee8ab1e0-ece4-4528-8609-7f56fb884ae8" path="/var/lib/kubelet/pods/ee8ab1e0-ece4-4528-8609-7f56fb884ae8/volumes" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.778436 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-66569d88ff-tjljh"] Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.805844 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/501b9024-4f9f-41eb-ae73-d9ecb0637363-etc-swift\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.805952 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/501b9024-4f9f-41eb-ae73-d9ecb0637363-run-httpd\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.805977 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/501b9024-4f9f-41eb-ae73-d9ecb0637363-internal-tls-certs\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.806006 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/501b9024-4f9f-41eb-ae73-d9ecb0637363-log-httpd\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.806031 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/501b9024-4f9f-41eb-ae73-d9ecb0637363-combined-ca-bundle\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.806056 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b95l9\" (UniqueName: \"kubernetes.io/projected/501b9024-4f9f-41eb-ae73-d9ecb0637363-kube-api-access-b95l9\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.806100 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/501b9024-4f9f-41eb-ae73-d9ecb0637363-public-tls-certs\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.806138 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/501b9024-4f9f-41eb-ae73-d9ecb0637363-config-data\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.907593 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/501b9024-4f9f-41eb-ae73-d9ecb0637363-run-httpd\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.907640 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/501b9024-4f9f-41eb-ae73-d9ecb0637363-internal-tls-certs\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.907675 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/501b9024-4f9f-41eb-ae73-d9ecb0637363-log-httpd\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.907696 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/501b9024-4f9f-41eb-ae73-d9ecb0637363-combined-ca-bundle\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.907712 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b95l9\" (UniqueName: 
\"kubernetes.io/projected/501b9024-4f9f-41eb-ae73-d9ecb0637363-kube-api-access-b95l9\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.907775 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/501b9024-4f9f-41eb-ae73-d9ecb0637363-public-tls-certs\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.907817 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/501b9024-4f9f-41eb-ae73-d9ecb0637363-config-data\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.907892 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/501b9024-4f9f-41eb-ae73-d9ecb0637363-etc-swift\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.909286 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/501b9024-4f9f-41eb-ae73-d9ecb0637363-run-httpd\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.910055 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/501b9024-4f9f-41eb-ae73-d9ecb0637363-log-httpd\") pod 
\"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.916533 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/501b9024-4f9f-41eb-ae73-d9ecb0637363-config-data\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.921631 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/501b9024-4f9f-41eb-ae73-d9ecb0637363-internal-tls-certs\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.925073 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/501b9024-4f9f-41eb-ae73-d9ecb0637363-etc-swift\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.929246 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/501b9024-4f9f-41eb-ae73-d9ecb0637363-combined-ca-bundle\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.931275 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/501b9024-4f9f-41eb-ae73-d9ecb0637363-public-tls-certs\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " 
pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:57 crc kubenswrapper[4740]: I1009 10:45:57.933385 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b95l9\" (UniqueName: \"kubernetes.io/projected/501b9024-4f9f-41eb-ae73-d9ecb0637363-kube-api-access-b95l9\") pod \"swift-proxy-66569d88ff-tjljh\" (UID: \"501b9024-4f9f-41eb-ae73-d9ecb0637363\") " pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.065437 4740 generic.go:334] "Generic (PLEG): container finished" podID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerID="dc54c526fa37681bf9b2356a7bd9ecff678f102e8f84d10425927807b3ae3379" exitCode=0 Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.065482 4740 generic.go:334] "Generic (PLEG): container finished" podID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerID="02d2855788adc8a6b430634bc0dc2ad436c644505ffb8ad4dbfc22e0c9ba8b19" exitCode=2 Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.065492 4740 generic.go:334] "Generic (PLEG): container finished" podID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerID="8fad223b9b5380edb8648535ccd0110e972a07409b1f00e7005dcda2620179b8" exitCode=0 Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.065512 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3","Type":"ContainerDied","Data":"dc54c526fa37681bf9b2356a7bd9ecff678f102e8f84d10425927807b3ae3379"} Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.065557 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3","Type":"ContainerDied","Data":"02d2855788adc8a6b430634bc0dc2ad436c644505ffb8ad4dbfc22e0c9ba8b19"} Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.065573 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3","Type":"ContainerDied","Data":"8fad223b9b5380edb8648535ccd0110e972a07409b1f00e7005dcda2620179b8"} Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.067997 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.442846 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-nbhjc"] Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.443905 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-nbhjc" Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.458028 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-nbhjc"] Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.559456 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-mx6td"] Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.563654 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-mx6td" Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.582104 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mx6td"] Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.620715 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js4rz\" (UniqueName: \"kubernetes.io/projected/7b317805-3df8-459e-a489-955c34dfb3d7-kube-api-access-js4rz\") pod \"nova-api-db-create-nbhjc\" (UID: \"7b317805-3df8-459e-a489-955c34dfb3d7\") " pod="openstack/nova-api-db-create-nbhjc" Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.645613 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-kcxlk"] Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.647014 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kcxlk" Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.654330 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kcxlk"] Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.725281 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr2mp\" (UniqueName: \"kubernetes.io/projected/51d8e6e3-8b07-45b3-8967-3eab12dab011-kube-api-access-kr2mp\") pod \"nova-cell0-db-create-mx6td\" (UID: \"51d8e6e3-8b07-45b3-8967-3eab12dab011\") " pod="openstack/nova-cell0-db-create-mx6td" Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.725589 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js4rz\" (UniqueName: \"kubernetes.io/projected/7b317805-3df8-459e-a489-955c34dfb3d7-kube-api-access-js4rz\") pod \"nova-api-db-create-nbhjc\" (UID: \"7b317805-3df8-459e-a489-955c34dfb3d7\") " pod="openstack/nova-api-db-create-nbhjc" Oct 09 10:45:58 crc 
kubenswrapper[4740]: I1009 10:45:58.751347 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js4rz\" (UniqueName: \"kubernetes.io/projected/7b317805-3df8-459e-a489-955c34dfb3d7-kube-api-access-js4rz\") pod \"nova-api-db-create-nbhjc\" (UID: \"7b317805-3df8-459e-a489-955c34dfb3d7\") " pod="openstack/nova-api-db-create-nbhjc" Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.772904 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-nbhjc" Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.832957 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4lmd\" (UniqueName: \"kubernetes.io/projected/d1437726-4284-4da9-a89e-d68ac67b5546-kube-api-access-z4lmd\") pod \"nova-cell1-db-create-kcxlk\" (UID: \"d1437726-4284-4da9-a89e-d68ac67b5546\") " pod="openstack/nova-cell1-db-create-kcxlk" Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.833073 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr2mp\" (UniqueName: \"kubernetes.io/projected/51d8e6e3-8b07-45b3-8967-3eab12dab011-kube-api-access-kr2mp\") pod \"nova-cell0-db-create-mx6td\" (UID: \"51d8e6e3-8b07-45b3-8967-3eab12dab011\") " pod="openstack/nova-cell0-db-create-mx6td" Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.851246 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr2mp\" (UniqueName: \"kubernetes.io/projected/51d8e6e3-8b07-45b3-8967-3eab12dab011-kube-api-access-kr2mp\") pod \"nova-cell0-db-create-mx6td\" (UID: \"51d8e6e3-8b07-45b3-8967-3eab12dab011\") " pod="openstack/nova-cell0-db-create-mx6td" Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.892731 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-mx6td" Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.935155 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4lmd\" (UniqueName: \"kubernetes.io/projected/d1437726-4284-4da9-a89e-d68ac67b5546-kube-api-access-z4lmd\") pod \"nova-cell1-db-create-kcxlk\" (UID: \"d1437726-4284-4da9-a89e-d68ac67b5546\") " pod="openstack/nova-cell1-db-create-kcxlk" Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.960098 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4lmd\" (UniqueName: \"kubernetes.io/projected/d1437726-4284-4da9-a89e-d68ac67b5546-kube-api-access-z4lmd\") pod \"nova-cell1-db-create-kcxlk\" (UID: \"d1437726-4284-4da9-a89e-d68ac67b5546\") " pod="openstack/nova-cell1-db-create-kcxlk" Oct 09 10:45:58 crc kubenswrapper[4740]: I1009 10:45:58.971303 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kcxlk" Oct 09 10:46:00 crc kubenswrapper[4740]: I1009 10:46:00.088157 4740 generic.go:334] "Generic (PLEG): container finished" podID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerID="1234ce3da3be36738b380320cb596846d7cde22c4335b42f547bae728cd1b611" exitCode=0 Oct 09 10:46:00 crc kubenswrapper[4740]: I1009 10:46:00.088199 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3","Type":"ContainerDied","Data":"1234ce3da3be36738b380320cb596846d7cde22c4335b42f547bae728cd1b611"} Oct 09 10:46:00 crc kubenswrapper[4740]: I1009 10:46:00.655655 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 09 10:46:03 crc kubenswrapper[4740]: I1009 10:46:03.516568 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6f67cbf644-2n99k" podUID="5d46647a-6230-4561-bd21-a433ed55dad2" containerName="horizon" 
probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Oct 09 10:46:03 crc kubenswrapper[4740]: I1009 10:46:03.516696 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.505810 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.558643 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-run-httpd\") pod \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.558779 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-config-data\") pod \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.558811 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-scripts\") pod \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.558995 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-combined-ca-bundle\") pod \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.559028 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-sg-core-conf-yaml\") pod \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.559085 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g22ht\" (UniqueName: \"kubernetes.io/projected/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-kube-api-access-g22ht\") pod \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.559114 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-log-httpd\") pod \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.559467 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" (UID: "eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.559647 4740 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.559851 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" (UID: "eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.563880 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-scripts" (OuterVolumeSpecName: "scripts") pod "eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" (UID: "eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.567257 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-kube-api-access-g22ht" (OuterVolumeSpecName: "kube-api-access-g22ht") pod "eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" (UID: "eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3"). InnerVolumeSpecName "kube-api-access-g22ht". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.598467 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" (UID: "eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.644568 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" (UID: "eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.660173 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-config-data" (OuterVolumeSpecName: "config-data") pod "eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" (UID: "eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.660267 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-config-data\") pod \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\" (UID: \"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3\") " Oct 09 10:46:04 crc kubenswrapper[4740]: W1009 10:46:04.660644 4740 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3/volumes/kubernetes.io~secret/config-data Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.660683 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-config-data" (OuterVolumeSpecName: "config-data") pod "eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" (UID: "eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.660934 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.660956 4740 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.660965 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g22ht\" (UniqueName: \"kubernetes.io/projected/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-kube-api-access-g22ht\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.660978 4740 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.660986 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.660994 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.788481 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-nbhjc"] Oct 09 10:46:04 crc kubenswrapper[4740]: W1009 10:46:04.794652 4740 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b317805_3df8_459e_a489_955c34dfb3d7.slice/crio-b24687de1b3a6d1255d1d89ea6cae3fb55890b172753bdf5d8636d807e39e641 WatchSource:0}: Error finding container b24687de1b3a6d1255d1d89ea6cae3fb55890b172753bdf5d8636d807e39e641: Status 404 returned error can't find the container with id b24687de1b3a6d1255d1d89ea6cae3fb55890b172753bdf5d8636d807e39e641 Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.909328 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kcxlk"] Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.919945 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mx6td"] Oct 09 10:46:04 crc kubenswrapper[4740]: I1009 10:46:04.982793 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-66569d88ff-tjljh"] Oct 09 10:46:04 crc kubenswrapper[4740]: W1009 10:46:04.994915 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod501b9024_4f9f_41eb_ae73_d9ecb0637363.slice/crio-286be6e323f0b4d776c4729d44e0e0934e3b6f35abd54ebc8c36dd681bd98b96 WatchSource:0}: Error finding container 286be6e323f0b4d776c4729d44e0e0934e3b6f35abd54ebc8c36dd681bd98b96: Status 404 returned error can't find the container with id 286be6e323f0b4d776c4729d44e0e0934e3b6f35abd54ebc8c36dd681bd98b96 Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.143023 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-66569d88ff-tjljh" event={"ID":"501b9024-4f9f-41eb-ae73-d9ecb0637363","Type":"ContainerStarted","Data":"286be6e323f0b4d776c4729d44e0e0934e3b6f35abd54ebc8c36dd681bd98b96"} Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.145446 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"efa49127-ef96-4ed6-8b72-c106e5575707","Type":"ContainerStarted","Data":"b52a7026686bab40fef55bb4bd9d74a783e22dccdf32a012dc11f9e117428705"} Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.153879 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.154924 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3","Type":"ContainerDied","Data":"173d24bdf4169915afd0d52529d7e0f65d43a6610fc8d03619444cafa3ee7388"} Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.155019 4740 scope.go:117] "RemoveContainer" containerID="dc54c526fa37681bf9b2356a7bd9ecff678f102e8f84d10425927807b3ae3379" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.157295 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mx6td" event={"ID":"51d8e6e3-8b07-45b3-8967-3eab12dab011","Type":"ContainerStarted","Data":"234e5dea477fc24a131ed83929ce4667f533ff8c358f5b80314afddc8f36077c"} Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.164146 4740 generic.go:334] "Generic (PLEG): container finished" podID="7b317805-3df8-459e-a489-955c34dfb3d7" containerID="a9428d4dec3d0dd3758cf0ec40f2ff733ad0913f706f5f38a629023ae5b65b63" exitCode=0 Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.164453 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-nbhjc" event={"ID":"7b317805-3df8-459e-a489-955c34dfb3d7","Type":"ContainerDied","Data":"a9428d4dec3d0dd3758cf0ec40f2ff733ad0913f706f5f38a629023ae5b65b63"} Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.165003 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-nbhjc" event={"ID":"7b317805-3df8-459e-a489-955c34dfb3d7","Type":"ContainerStarted","Data":"b24687de1b3a6d1255d1d89ea6cae3fb55890b172753bdf5d8636d807e39e641"} Oct 09 
10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.172877 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kcxlk" event={"ID":"d1437726-4284-4da9-a89e-d68ac67b5546","Type":"ContainerStarted","Data":"4b2dffbb6ac734e76b4e18a760104669ebd7c2c3141819ef59a5116f09a77f27"} Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.172942 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kcxlk" event={"ID":"d1437726-4284-4da9-a89e-d68ac67b5546","Type":"ContainerStarted","Data":"2eb212df4d1c79451efb70539824e322cc74f0fec5c4f3d772fffadbd726d2ae"} Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.176744 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.145208127 podStartE2EDuration="13.17671937s" podCreationTimestamp="2025-10-09 10:45:52 +0000 UTC" firstStartedPulling="2025-10-09 10:45:53.251248272 +0000 UTC m=+1092.213448663" lastFinishedPulling="2025-10-09 10:46:04.282759525 +0000 UTC m=+1103.244959906" observedRunningTime="2025-10-09 10:46:05.166946585 +0000 UTC m=+1104.129146976" watchObservedRunningTime="2025-10-09 10:46:05.17671937 +0000 UTC m=+1104.138919751" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.286236 4740 scope.go:117] "RemoveContainer" containerID="02d2855788adc8a6b430634bc0dc2ad436c644505ffb8ad4dbfc22e0c9ba8b19" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.298417 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.314803 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.331630 4740 scope.go:117] "RemoveContainer" containerID="1234ce3da3be36738b380320cb596846d7cde22c4335b42f547bae728cd1b611" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.338881 4740 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ceilometer-0"] Oct 09 10:46:05 crc kubenswrapper[4740]: E1009 10:46:05.339362 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerName="ceilometer-notification-agent" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.339381 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerName="ceilometer-notification-agent" Oct 09 10:46:05 crc kubenswrapper[4740]: E1009 10:46:05.339396 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerName="ceilometer-central-agent" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.339403 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerName="ceilometer-central-agent" Oct 09 10:46:05 crc kubenswrapper[4740]: E1009 10:46:05.339424 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerName="proxy-httpd" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.339429 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerName="proxy-httpd" Oct 09 10:46:05 crc kubenswrapper[4740]: E1009 10:46:05.339443 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerName="sg-core" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.339448 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerName="sg-core" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.339601 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerName="ceilometer-central-agent" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.339616 4740 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerName="proxy-httpd" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.339624 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerName="sg-core" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.339632 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerName="ceilometer-notification-agent" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.347501 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.350179 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.372699 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.398568 4740 scope.go:117] "RemoveContainer" containerID="8fad223b9b5380edb8648535ccd0110e972a07409b1f00e7005dcda2620179b8" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.401772 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.408804 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.408836 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.480660 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.480696 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-log-httpd\") pod \"ceilometer-0\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.480738 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-config-data\") pod \"ceilometer-0\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.480766 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-scripts\") pod \"ceilometer-0\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.481375 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-run-httpd\") pod \"ceilometer-0\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.481411 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.481553 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5dr5\" (UniqueName: \"kubernetes.io/projected/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-kube-api-access-q5dr5\") pod \"ceilometer-0\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.582742 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-config-data\") pod \"ceilometer-0\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.582816 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-scripts\") pod \"ceilometer-0\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.582838 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-run-httpd\") pod \"ceilometer-0\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.582862 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.582889 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5dr5\" (UniqueName: \"kubernetes.io/projected/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-kube-api-access-q5dr5\") pod \"ceilometer-0\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.582978 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.582991 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-log-httpd\") pod \"ceilometer-0\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.583702 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-log-httpd\") pod \"ceilometer-0\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.585025 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-run-httpd\") pod \"ceilometer-0\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.586538 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.586693 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.587129 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-config-data\") pod \"ceilometer-0\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.589200 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-scripts\") pod \"ceilometer-0\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.599997 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5dr5\" (UniqueName: \"kubernetes.io/projected/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-kube-api-access-q5dr5\") pod \"ceilometer-0\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.668511 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:46:05 crc kubenswrapper[4740]: I1009 10:46:05.767960 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" path="/var/lib/kubelet/pods/eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3/volumes" Oct 09 10:46:06 crc kubenswrapper[4740]: I1009 10:46:06.169164 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:06 crc kubenswrapper[4740]: I1009 10:46:06.187903 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899","Type":"ContainerStarted","Data":"7104c871140ec85d070420d62239ad3355bd185e248b7ab63ef4425fe5f7feb0"} Oct 09 10:46:06 crc kubenswrapper[4740]: I1009 10:46:06.192893 4740 generic.go:334] "Generic (PLEG): container finished" podID="51d8e6e3-8b07-45b3-8967-3eab12dab011" containerID="ca08e689127b608fcdc76804ee25ac188b19c9ba210d1536ab9fb62b1ddbde5a" exitCode=0 Oct 09 10:46:06 crc kubenswrapper[4740]: I1009 10:46:06.192974 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mx6td" event={"ID":"51d8e6e3-8b07-45b3-8967-3eab12dab011","Type":"ContainerDied","Data":"ca08e689127b608fcdc76804ee25ac188b19c9ba210d1536ab9fb62b1ddbde5a"} Oct 09 10:46:06 crc kubenswrapper[4740]: I1009 10:46:06.195890 4740 generic.go:334] "Generic (PLEG): container finished" podID="d1437726-4284-4da9-a89e-d68ac67b5546" containerID="4b2dffbb6ac734e76b4e18a760104669ebd7c2c3141819ef59a5116f09a77f27" exitCode=0 Oct 09 10:46:06 crc kubenswrapper[4740]: I1009 10:46:06.195928 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kcxlk" event={"ID":"d1437726-4284-4da9-a89e-d68ac67b5546","Type":"ContainerDied","Data":"4b2dffbb6ac734e76b4e18a760104669ebd7c2c3141819ef59a5116f09a77f27"} Oct 09 10:46:06 crc kubenswrapper[4740]: I1009 10:46:06.198150 4740 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/swift-proxy-66569d88ff-tjljh" event={"ID":"501b9024-4f9f-41eb-ae73-d9ecb0637363","Type":"ContainerStarted","Data":"fbb0d576886eee88f096e6cbc3c0150c94fa5560f03840debd9f3f65b84d31ef"} Oct 09 10:46:06 crc kubenswrapper[4740]: I1009 10:46:06.198182 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-66569d88ff-tjljh" event={"ID":"501b9024-4f9f-41eb-ae73-d9ecb0637363","Type":"ContainerStarted","Data":"da67598c8ed6724ac625b6bdf95295a309761b3f02f894973d92ac808e958f60"} Oct 09 10:46:06 crc kubenswrapper[4740]: I1009 10:46:06.198334 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:46:06 crc kubenswrapper[4740]: I1009 10:46:06.198379 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:46:06 crc kubenswrapper[4740]: I1009 10:46:06.703230 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kcxlk" Oct 09 10:46:06 crc kubenswrapper[4740]: I1009 10:46:06.735094 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-66569d88ff-tjljh" podStartSLOduration=9.735074061 podStartE2EDuration="9.735074061s" podCreationTimestamp="2025-10-09 10:45:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:46:06.230418612 +0000 UTC m=+1105.192618993" watchObservedRunningTime="2025-10-09 10:46:06.735074061 +0000 UTC m=+1105.697274442" Oct 09 10:46:06 crc kubenswrapper[4740]: I1009 10:46:06.800580 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-nbhjc" Oct 09 10:46:06 crc kubenswrapper[4740]: I1009 10:46:06.819384 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4lmd\" (UniqueName: \"kubernetes.io/projected/d1437726-4284-4da9-a89e-d68ac67b5546-kube-api-access-z4lmd\") pod \"d1437726-4284-4da9-a89e-d68ac67b5546\" (UID: \"d1437726-4284-4da9-a89e-d68ac67b5546\") " Oct 09 10:46:06 crc kubenswrapper[4740]: I1009 10:46:06.826398 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1437726-4284-4da9-a89e-d68ac67b5546-kube-api-access-z4lmd" (OuterVolumeSpecName: "kube-api-access-z4lmd") pod "d1437726-4284-4da9-a89e-d68ac67b5546" (UID: "d1437726-4284-4da9-a89e-d68ac67b5546"). InnerVolumeSpecName "kube-api-access-z4lmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:46:06 crc kubenswrapper[4740]: I1009 10:46:06.924726 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js4rz\" (UniqueName: \"kubernetes.io/projected/7b317805-3df8-459e-a489-955c34dfb3d7-kube-api-access-js4rz\") pod \"7b317805-3df8-459e-a489-955c34dfb3d7\" (UID: \"7b317805-3df8-459e-a489-955c34dfb3d7\") " Oct 09 10:46:06 crc kubenswrapper[4740]: I1009 10:46:06.925744 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4lmd\" (UniqueName: \"kubernetes.io/projected/d1437726-4284-4da9-a89e-d68ac67b5546-kube-api-access-z4lmd\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:06 crc kubenswrapper[4740]: I1009 10:46:06.930642 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b317805-3df8-459e-a489-955c34dfb3d7-kube-api-access-js4rz" (OuterVolumeSpecName: "kube-api-access-js4rz") pod "7b317805-3df8-459e-a489-955c34dfb3d7" (UID: "7b317805-3df8-459e-a489-955c34dfb3d7"). InnerVolumeSpecName "kube-api-access-js4rz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:46:07 crc kubenswrapper[4740]: I1009 10:46:07.027362 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js4rz\" (UniqueName: \"kubernetes.io/projected/7b317805-3df8-459e-a489-955c34dfb3d7-kube-api-access-js4rz\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:07 crc kubenswrapper[4740]: I1009 10:46:07.208620 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-nbhjc" event={"ID":"7b317805-3df8-459e-a489-955c34dfb3d7","Type":"ContainerDied","Data":"b24687de1b3a6d1255d1d89ea6cae3fb55890b172753bdf5d8636d807e39e641"} Oct 09 10:46:07 crc kubenswrapper[4740]: I1009 10:46:07.208659 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b24687de1b3a6d1255d1d89ea6cae3fb55890b172753bdf5d8636d807e39e641" Oct 09 10:46:07 crc kubenswrapper[4740]: I1009 10:46:07.208637 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-nbhjc" Oct 09 10:46:07 crc kubenswrapper[4740]: I1009 10:46:07.209985 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-kcxlk" Oct 09 10:46:07 crc kubenswrapper[4740]: I1009 10:46:07.209986 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kcxlk" event={"ID":"d1437726-4284-4da9-a89e-d68ac67b5546","Type":"ContainerDied","Data":"2eb212df4d1c79451efb70539824e322cc74f0fec5c4f3d772fffadbd726d2ae"} Oct 09 10:46:07 crc kubenswrapper[4740]: I1009 10:46:07.210101 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2eb212df4d1c79451efb70539824e322cc74f0fec5c4f3d772fffadbd726d2ae" Oct 09 10:46:07 crc kubenswrapper[4740]: I1009 10:46:07.211390 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899","Type":"ContainerStarted","Data":"9299f93627ca20b477b422110ef8e4883574d48c25720d55fed3ee878f384f65"} Oct 09 10:46:07 crc kubenswrapper[4740]: I1009 10:46:07.528372 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mx6td" Oct 09 10:46:07 crc kubenswrapper[4740]: I1009 10:46:07.637692 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr2mp\" (UniqueName: \"kubernetes.io/projected/51d8e6e3-8b07-45b3-8967-3eab12dab011-kube-api-access-kr2mp\") pod \"51d8e6e3-8b07-45b3-8967-3eab12dab011\" (UID: \"51d8e6e3-8b07-45b3-8967-3eab12dab011\") " Oct 09 10:46:07 crc kubenswrapper[4740]: I1009 10:46:07.644660 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d8e6e3-8b07-45b3-8967-3eab12dab011-kube-api-access-kr2mp" (OuterVolumeSpecName: "kube-api-access-kr2mp") pod "51d8e6e3-8b07-45b3-8967-3eab12dab011" (UID: "51d8e6e3-8b07-45b3-8967-3eab12dab011"). InnerVolumeSpecName "kube-api-access-kr2mp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:46:07 crc kubenswrapper[4740]: I1009 10:46:07.740643 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr2mp\" (UniqueName: \"kubernetes.io/projected/51d8e6e3-8b07-45b3-8967-3eab12dab011-kube-api-access-kr2mp\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:08 crc kubenswrapper[4740]: I1009 10:46:08.220620 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899","Type":"ContainerStarted","Data":"4d96315d59d2bcc76bfaf4d91623338c5f242474f9f5bf3a5ccb366512224823"} Oct 09 10:46:08 crc kubenswrapper[4740]: I1009 10:46:08.222143 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mx6td" event={"ID":"51d8e6e3-8b07-45b3-8967-3eab12dab011","Type":"ContainerDied","Data":"234e5dea477fc24a131ed83929ce4667f533ff8c358f5b80314afddc8f36077c"} Oct 09 10:46:08 crc kubenswrapper[4740]: I1009 10:46:08.222247 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="234e5dea477fc24a131ed83929ce4667f533ff8c358f5b80314afddc8f36077c" Oct 09 10:46:08 crc kubenswrapper[4740]: I1009 10:46:08.222191 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mx6td" Oct 09 10:46:08 crc kubenswrapper[4740]: I1009 10:46:08.477625 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:09 crc kubenswrapper[4740]: I1009 10:46:09.235342 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899","Type":"ContainerStarted","Data":"815a9159b9fe57399423c01c6308496c5454fdc7ddb62c391466d9a2dae6e138"} Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.221510 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.245632 4740 generic.go:334] "Generic (PLEG): container finished" podID="5d46647a-6230-4561-bd21-a433ed55dad2" containerID="f484446da81e0b28190c45af24087e227074250e4f8f8dace13601acb254a05a" exitCode=137 Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.245673 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f67cbf644-2n99k" event={"ID":"5d46647a-6230-4561-bd21-a433ed55dad2","Type":"ContainerDied","Data":"f484446da81e0b28190c45af24087e227074250e4f8f8dace13601acb254a05a"} Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.245705 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f67cbf644-2n99k" Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.245727 4740 scope.go:117] "RemoveContainer" containerID="bf82d8ce076fba29ba8e3099eed54d6fb6ae0c40849c42e47a11986d72082416" Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.245716 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f67cbf644-2n99k" event={"ID":"5d46647a-6230-4561-bd21-a433ed55dad2","Type":"ContainerDied","Data":"fee3d95f7ac812514644349f20350b4d2e83230b04c919591291a3ab9d1cd1cd"} Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.293127 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d46647a-6230-4561-bd21-a433ed55dad2-combined-ca-bundle\") pod \"5d46647a-6230-4561-bd21-a433ed55dad2\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.293217 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5d46647a-6230-4561-bd21-a433ed55dad2-horizon-secret-key\") pod \"5d46647a-6230-4561-bd21-a433ed55dad2\" (UID: 
\"5d46647a-6230-4561-bd21-a433ed55dad2\") " Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.293293 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d46647a-6230-4561-bd21-a433ed55dad2-config-data\") pod \"5d46647a-6230-4561-bd21-a433ed55dad2\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.293316 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlfrb\" (UniqueName: \"kubernetes.io/projected/5d46647a-6230-4561-bd21-a433ed55dad2-kube-api-access-qlfrb\") pod \"5d46647a-6230-4561-bd21-a433ed55dad2\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.293389 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d46647a-6230-4561-bd21-a433ed55dad2-scripts\") pod \"5d46647a-6230-4561-bd21-a433ed55dad2\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.293423 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d46647a-6230-4561-bd21-a433ed55dad2-horizon-tls-certs\") pod \"5d46647a-6230-4561-bd21-a433ed55dad2\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.293476 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d46647a-6230-4561-bd21-a433ed55dad2-logs\") pod \"5d46647a-6230-4561-bd21-a433ed55dad2\" (UID: \"5d46647a-6230-4561-bd21-a433ed55dad2\") " Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.294355 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d46647a-6230-4561-bd21-a433ed55dad2-logs" 
(OuterVolumeSpecName: "logs") pod "5d46647a-6230-4561-bd21-a433ed55dad2" (UID: "5d46647a-6230-4561-bd21-a433ed55dad2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.298985 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d46647a-6230-4561-bd21-a433ed55dad2-kube-api-access-qlfrb" (OuterVolumeSpecName: "kube-api-access-qlfrb") pod "5d46647a-6230-4561-bd21-a433ed55dad2" (UID: "5d46647a-6230-4561-bd21-a433ed55dad2"). InnerVolumeSpecName "kube-api-access-qlfrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.323706 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d46647a-6230-4561-bd21-a433ed55dad2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5d46647a-6230-4561-bd21-a433ed55dad2" (UID: "5d46647a-6230-4561-bd21-a433ed55dad2"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.343567 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d46647a-6230-4561-bd21-a433ed55dad2-config-data" (OuterVolumeSpecName: "config-data") pod "5d46647a-6230-4561-bd21-a433ed55dad2" (UID: "5d46647a-6230-4561-bd21-a433ed55dad2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.357023 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d46647a-6230-4561-bd21-a433ed55dad2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d46647a-6230-4561-bd21-a433ed55dad2" (UID: "5d46647a-6230-4561-bd21-a433ed55dad2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.358040 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d46647a-6230-4561-bd21-a433ed55dad2-scripts" (OuterVolumeSpecName: "scripts") pod "5d46647a-6230-4561-bd21-a433ed55dad2" (UID: "5d46647a-6230-4561-bd21-a433ed55dad2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.372099 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d46647a-6230-4561-bd21-a433ed55dad2-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "5d46647a-6230-4561-bd21-a433ed55dad2" (UID: "5d46647a-6230-4561-bd21-a433ed55dad2"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.394591 4740 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5d46647a-6230-4561-bd21-a433ed55dad2-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.394874 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d46647a-6230-4561-bd21-a433ed55dad2-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.394885 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlfrb\" (UniqueName: \"kubernetes.io/projected/5d46647a-6230-4561-bd21-a433ed55dad2-kube-api-access-qlfrb\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.394896 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d46647a-6230-4561-bd21-a433ed55dad2-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 
10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.394905 4740 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d46647a-6230-4561-bd21-a433ed55dad2-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.394912 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d46647a-6230-4561-bd21-a433ed55dad2-logs\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.394920 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d46647a-6230-4561-bd21-a433ed55dad2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.430856 4740 scope.go:117] "RemoveContainer" containerID="f484446da81e0b28190c45af24087e227074250e4f8f8dace13601acb254a05a" Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.447021 4740 scope.go:117] "RemoveContainer" containerID="bf82d8ce076fba29ba8e3099eed54d6fb6ae0c40849c42e47a11986d72082416" Oct 09 10:46:10 crc kubenswrapper[4740]: E1009 10:46:10.447359 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf82d8ce076fba29ba8e3099eed54d6fb6ae0c40849c42e47a11986d72082416\": container with ID starting with bf82d8ce076fba29ba8e3099eed54d6fb6ae0c40849c42e47a11986d72082416 not found: ID does not exist" containerID="bf82d8ce076fba29ba8e3099eed54d6fb6ae0c40849c42e47a11986d72082416" Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.447401 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf82d8ce076fba29ba8e3099eed54d6fb6ae0c40849c42e47a11986d72082416"} err="failed to get container status \"bf82d8ce076fba29ba8e3099eed54d6fb6ae0c40849c42e47a11986d72082416\": rpc error: code = NotFound desc = could not find 
container \"bf82d8ce076fba29ba8e3099eed54d6fb6ae0c40849c42e47a11986d72082416\": container with ID starting with bf82d8ce076fba29ba8e3099eed54d6fb6ae0c40849c42e47a11986d72082416 not found: ID does not exist" Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.447428 4740 scope.go:117] "RemoveContainer" containerID="f484446da81e0b28190c45af24087e227074250e4f8f8dace13601acb254a05a" Oct 09 10:46:10 crc kubenswrapper[4740]: E1009 10:46:10.447733 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f484446da81e0b28190c45af24087e227074250e4f8f8dace13601acb254a05a\": container with ID starting with f484446da81e0b28190c45af24087e227074250e4f8f8dace13601acb254a05a not found: ID does not exist" containerID="f484446da81e0b28190c45af24087e227074250e4f8f8dace13601acb254a05a" Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.447756 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f484446da81e0b28190c45af24087e227074250e4f8f8dace13601acb254a05a"} err="failed to get container status \"f484446da81e0b28190c45af24087e227074250e4f8f8dace13601acb254a05a\": rpc error: code = NotFound desc = could not find container \"f484446da81e0b28190c45af24087e227074250e4f8f8dace13601acb254a05a\": container with ID starting with f484446da81e0b28190c45af24087e227074250e4f8f8dace13601acb254a05a not found: ID does not exist" Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.573582 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f67cbf644-2n99k"] Oct 09 10:46:10 crc kubenswrapper[4740]: I1009 10:46:10.580227 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6f67cbf644-2n99k"] Oct 09 10:46:11 crc kubenswrapper[4740]: I1009 10:46:11.261995 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899","Type":"ContainerStarted","Data":"7fcc70f5adfc4815310ca15eeb1a6f28d89d4bc150d0ebaadf377079ce592651"} Oct 09 10:46:11 crc kubenswrapper[4740]: I1009 10:46:11.262419 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 10:46:11 crc kubenswrapper[4740]: I1009 10:46:11.262214 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" containerName="proxy-httpd" containerID="cri-o://7fcc70f5adfc4815310ca15eeb1a6f28d89d4bc150d0ebaadf377079ce592651" gracePeriod=30 Oct 09 10:46:11 crc kubenswrapper[4740]: I1009 10:46:11.262239 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" containerName="ceilometer-notification-agent" containerID="cri-o://4d96315d59d2bcc76bfaf4d91623338c5f242474f9f5bf3a5ccb366512224823" gracePeriod=30 Oct 09 10:46:11 crc kubenswrapper[4740]: I1009 10:46:11.262296 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" containerName="sg-core" containerID="cri-o://815a9159b9fe57399423c01c6308496c5454fdc7ddb62c391466d9a2dae6e138" gracePeriod=30 Oct 09 10:46:11 crc kubenswrapper[4740]: I1009 10:46:11.262176 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" containerName="ceilometer-central-agent" containerID="cri-o://9299f93627ca20b477b422110ef8e4883574d48c25720d55fed3ee878f384f65" gracePeriod=30 Oct 09 10:46:11 crc kubenswrapper[4740]: I1009 10:46:11.320447 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.826324453 podStartE2EDuration="6.320429685s" podCreationTimestamp="2025-10-09 10:46:05 +0000 UTC" 
firstStartedPulling="2025-10-09 10:46:06.171979709 +0000 UTC m=+1105.134180080" lastFinishedPulling="2025-10-09 10:46:10.666084931 +0000 UTC m=+1109.628285312" observedRunningTime="2025-10-09 10:46:11.289130608 +0000 UTC m=+1110.251330999" watchObservedRunningTime="2025-10-09 10:46:11.320429685 +0000 UTC m=+1110.282630056" Oct 09 10:46:11 crc kubenswrapper[4740]: I1009 10:46:11.779980 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d46647a-6230-4561-bd21-a433ed55dad2" path="/var/lib/kubelet/pods/5d46647a-6230-4561-bd21-a433ed55dad2/volumes" Oct 09 10:46:12 crc kubenswrapper[4740]: I1009 10:46:12.276533 4740 generic.go:334] "Generic (PLEG): container finished" podID="5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" containerID="7fcc70f5adfc4815310ca15eeb1a6f28d89d4bc150d0ebaadf377079ce592651" exitCode=0 Oct 09 10:46:12 crc kubenswrapper[4740]: I1009 10:46:12.276870 4740 generic.go:334] "Generic (PLEG): container finished" podID="5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" containerID="815a9159b9fe57399423c01c6308496c5454fdc7ddb62c391466d9a2dae6e138" exitCode=2 Oct 09 10:46:12 crc kubenswrapper[4740]: I1009 10:46:12.276882 4740 generic.go:334] "Generic (PLEG): container finished" podID="5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" containerID="4d96315d59d2bcc76bfaf4d91623338c5f242474f9f5bf3a5ccb366512224823" exitCode=0 Oct 09 10:46:12 crc kubenswrapper[4740]: I1009 10:46:12.276889 4740 generic.go:334] "Generic (PLEG): container finished" podID="5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" containerID="9299f93627ca20b477b422110ef8e4883574d48c25720d55fed3ee878f384f65" exitCode=0 Oct 09 10:46:12 crc kubenswrapper[4740]: I1009 10:46:12.276619 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899","Type":"ContainerDied","Data":"7fcc70f5adfc4815310ca15eeb1a6f28d89d4bc150d0ebaadf377079ce592651"} Oct 09 10:46:12 crc kubenswrapper[4740]: I1009 10:46:12.276929 4740 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899","Type":"ContainerDied","Data":"815a9159b9fe57399423c01c6308496c5454fdc7ddb62c391466d9a2dae6e138"} Oct 09 10:46:12 crc kubenswrapper[4740]: I1009 10:46:12.276939 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899","Type":"ContainerDied","Data":"4d96315d59d2bcc76bfaf4d91623338c5f242474f9f5bf3a5ccb366512224823"} Oct 09 10:46:12 crc kubenswrapper[4740]: I1009 10:46:12.276949 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899","Type":"ContainerDied","Data":"9299f93627ca20b477b422110ef8e4883574d48c25720d55fed3ee878f384f65"} Oct 09 10:46:12 crc kubenswrapper[4740]: I1009 10:46:12.836338 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:46:12 crc kubenswrapper[4740]: I1009 10:46:12.941078 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-scripts\") pod \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " Oct 09 10:46:12 crc kubenswrapper[4740]: I1009 10:46:12.941242 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-sg-core-conf-yaml\") pod \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " Oct 09 10:46:12 crc kubenswrapper[4740]: I1009 10:46:12.941294 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-log-httpd\") pod \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " 
Oct 09 10:46:12 crc kubenswrapper[4740]: I1009 10:46:12.941376 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-config-data\") pod \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " Oct 09 10:46:12 crc kubenswrapper[4740]: I1009 10:46:12.941400 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5dr5\" (UniqueName: \"kubernetes.io/projected/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-kube-api-access-q5dr5\") pod \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " Oct 09 10:46:12 crc kubenswrapper[4740]: I1009 10:46:12.941438 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-run-httpd\") pod \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " Oct 09 10:46:12 crc kubenswrapper[4740]: I1009 10:46:12.941500 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-combined-ca-bundle\") pod \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\" (UID: \"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899\") " Oct 09 10:46:12 crc kubenswrapper[4740]: I1009 10:46:12.942838 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" (UID: "5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:46:12 crc kubenswrapper[4740]: I1009 10:46:12.943141 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" (UID: "5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:46:12 crc kubenswrapper[4740]: I1009 10:46:12.947295 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-kube-api-access-q5dr5" (OuterVolumeSpecName: "kube-api-access-q5dr5") pod "5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" (UID: "5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899"). InnerVolumeSpecName "kube-api-access-q5dr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:46:12 crc kubenswrapper[4740]: I1009 10:46:12.947611 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-scripts" (OuterVolumeSpecName: "scripts") pod "5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" (UID: "5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:12 crc kubenswrapper[4740]: I1009 10:46:12.983430 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" (UID: "5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.024036 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" (UID: "5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.043965 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5dr5\" (UniqueName: \"kubernetes.io/projected/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-kube-api-access-q5dr5\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.044185 4740 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.044268 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.044349 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.044479 4740 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.044557 4740 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.066987 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-config-data" (OuterVolumeSpecName: "config-data") pod "5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" (UID: "5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.078688 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.080424 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-66569d88ff-tjljh" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.145657 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.286890 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899","Type":"ContainerDied","Data":"7104c871140ec85d070420d62239ad3355bd185e248b7ab63ef4425fe5f7feb0"} Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.286946 4740 scope.go:117] "RemoveContainer" containerID="7fcc70f5adfc4815310ca15eeb1a6f28d89d4bc150d0ebaadf377079ce592651" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.287176 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.314911 4740 scope.go:117] "RemoveContainer" containerID="815a9159b9fe57399423c01c6308496c5454fdc7ddb62c391466d9a2dae6e138" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.324518 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.331433 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.338486 4740 scope.go:117] "RemoveContainer" containerID="4d96315d59d2bcc76bfaf4d91623338c5f242474f9f5bf3a5ccb366512224823" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.351183 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:13 crc kubenswrapper[4740]: E1009 10:46:13.351529 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d8e6e3-8b07-45b3-8967-3eab12dab011" containerName="mariadb-database-create" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.351547 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d8e6e3-8b07-45b3-8967-3eab12dab011" containerName="mariadb-database-create" Oct 09 10:46:13 crc kubenswrapper[4740]: E1009 10:46:13.351562 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" containerName="sg-core" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.351569 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" containerName="sg-core" Oct 09 10:46:13 crc kubenswrapper[4740]: E1009 10:46:13.351593 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d46647a-6230-4561-bd21-a433ed55dad2" containerName="horizon-log" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.351599 4740 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5d46647a-6230-4561-bd21-a433ed55dad2" containerName="horizon-log" Oct 09 10:46:13 crc kubenswrapper[4740]: E1009 10:46:13.351608 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b317805-3df8-459e-a489-955c34dfb3d7" containerName="mariadb-database-create" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.351613 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b317805-3df8-459e-a489-955c34dfb3d7" containerName="mariadb-database-create" Oct 09 10:46:13 crc kubenswrapper[4740]: E1009 10:46:13.351627 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" containerName="ceilometer-notification-agent" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.351632 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" containerName="ceilometer-notification-agent" Oct 09 10:46:13 crc kubenswrapper[4740]: E1009 10:46:13.351643 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" containerName="ceilometer-central-agent" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.351648 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" containerName="ceilometer-central-agent" Oct 09 10:46:13 crc kubenswrapper[4740]: E1009 10:46:13.351668 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1437726-4284-4da9-a89e-d68ac67b5546" containerName="mariadb-database-create" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.351674 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1437726-4284-4da9-a89e-d68ac67b5546" containerName="mariadb-database-create" Oct 09 10:46:13 crc kubenswrapper[4740]: E1009 10:46:13.351684 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d46647a-6230-4561-bd21-a433ed55dad2" containerName="horizon" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.351691 
4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d46647a-6230-4561-bd21-a433ed55dad2" containerName="horizon" Oct 09 10:46:13 crc kubenswrapper[4740]: E1009 10:46:13.351701 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" containerName="proxy-httpd" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.351706 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" containerName="proxy-httpd" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.351879 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1437726-4284-4da9-a89e-d68ac67b5546" containerName="mariadb-database-create" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.351889 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" containerName="proxy-httpd" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.351899 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d46647a-6230-4561-bd21-a433ed55dad2" containerName="horizon" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.351912 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d8e6e3-8b07-45b3-8967-3eab12dab011" containerName="mariadb-database-create" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.351919 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" containerName="sg-core" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.351928 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d46647a-6230-4561-bd21-a433ed55dad2" containerName="horizon-log" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.351938 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" containerName="ceilometer-central-agent" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.351947 4740 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7b317805-3df8-459e-a489-955c34dfb3d7" containerName="mariadb-database-create" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.351954 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" containerName="ceilometer-notification-agent" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.353980 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.359186 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.359390 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.363385 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.363855 4740 scope.go:117] "RemoveContainer" containerID="9299f93627ca20b477b422110ef8e4883574d48c25720d55fed3ee878f384f65" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.553307 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.553379 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb5sb\" (UniqueName: \"kubernetes.io/projected/d7216db2-398a-4681-84b0-094da77d597f-kube-api-access-tb5sb\") pod \"ceilometer-0\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: 
I1009 10:46:13.553416 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-scripts\") pod \"ceilometer-0\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.553452 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-config-data\") pod \"ceilometer-0\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.553526 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7216db2-398a-4681-84b0-094da77d597f-log-httpd\") pod \"ceilometer-0\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.553568 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7216db2-398a-4681-84b0-094da77d597f-run-httpd\") pod \"ceilometer-0\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.553588 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.655578 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb5sb\" (UniqueName: 
\"kubernetes.io/projected/d7216db2-398a-4681-84b0-094da77d597f-kube-api-access-tb5sb\") pod \"ceilometer-0\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.655955 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-scripts\") pod \"ceilometer-0\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.656158 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-config-data\") pod \"ceilometer-0\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.656340 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7216db2-398a-4681-84b0-094da77d597f-log-httpd\") pod \"ceilometer-0\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.656459 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7216db2-398a-4681-84b0-094da77d597f-run-httpd\") pod \"ceilometer-0\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.656565 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.656686 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.656965 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7216db2-398a-4681-84b0-094da77d597f-run-httpd\") pod \"ceilometer-0\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.657127 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7216db2-398a-4681-84b0-094da77d597f-log-httpd\") pod \"ceilometer-0\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.660364 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-config-data\") pod \"ceilometer-0\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.660937 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-scripts\") pod \"ceilometer-0\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.661040 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " pod="openstack/ceilometer-0" Oct 
09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.671632 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb5sb\" (UniqueName: \"kubernetes.io/projected/d7216db2-398a-4681-84b0-094da77d597f-kube-api-access-tb5sb\") pod \"ceilometer-0\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.689251 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.709698 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:46:13 crc kubenswrapper[4740]: I1009 10:46:13.771950 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899" path="/var/lib/kubelet/pods/5f0a4a49-4dd6-43dd-b72e-cec9bcfb1899/volumes" Oct 09 10:46:14 crc kubenswrapper[4740]: I1009 10:46:14.077176 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:14 crc kubenswrapper[4740]: I1009 10:46:14.295570 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7216db2-398a-4681-84b0-094da77d597f","Type":"ContainerStarted","Data":"15a54113a82d4b611f422a25a08d368917d507672bfe531664d7e81855745942"} Oct 09 10:46:15 crc kubenswrapper[4740]: I1009 10:46:15.308367 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7216db2-398a-4681-84b0-094da77d597f","Type":"ContainerStarted","Data":"3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31"} Oct 09 10:46:15 crc kubenswrapper[4740]: I1009 10:46:15.554174 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/neutron-5f54d79f5b-tsjf9" Oct 09 10:46:16 crc kubenswrapper[4740]: I1009 10:46:16.344482 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7216db2-398a-4681-84b0-094da77d597f","Type":"ContainerStarted","Data":"806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433"} Oct 09 10:46:16 crc kubenswrapper[4740]: I1009 10:46:16.345083 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7216db2-398a-4681-84b0-094da77d597f","Type":"ContainerStarted","Data":"4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c"} Oct 09 10:46:17 crc kubenswrapper[4740]: I1009 10:46:17.939484 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-d74f6589-zvlln" Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.007580 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5f54d79f5b-tsjf9"] Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.009169 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5f54d79f5b-tsjf9" podUID="3a555e46-61f3-4a67-9d18-9acf13b859f7" containerName="neutron-httpd" containerID="cri-o://101e0a0b0c0e372ef0da471fe6744819ea5832156da9489ca62d0b38ba0b893a" gracePeriod=30 Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.009332 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5f54d79f5b-tsjf9" podUID="3a555e46-61f3-4a67-9d18-9acf13b859f7" containerName="neutron-api" containerID="cri-o://e3247a241eda0c21d93b7a8f1fdd98f5d0ab875cb2dec08ca673ec9a6f900664" gracePeriod=30 Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.362156 4740 generic.go:334] "Generic (PLEG): container finished" podID="3a555e46-61f3-4a67-9d18-9acf13b859f7" containerID="101e0a0b0c0e372ef0da471fe6744819ea5832156da9489ca62d0b38ba0b893a" exitCode=0 Oct 09 10:46:18 crc kubenswrapper[4740]: 
I1009 10:46:18.362240 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f54d79f5b-tsjf9" event={"ID":"3a555e46-61f3-4a67-9d18-9acf13b859f7","Type":"ContainerDied","Data":"101e0a0b0c0e372ef0da471fe6744819ea5832156da9489ca62d0b38ba0b893a"} Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.365321 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7216db2-398a-4681-84b0-094da77d597f","Type":"ContainerStarted","Data":"b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab"} Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.365746 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.386285 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8688302810000001 podStartE2EDuration="5.386269789s" podCreationTimestamp="2025-10-09 10:46:13 +0000 UTC" firstStartedPulling="2025-10-09 10:46:14.086379157 +0000 UTC m=+1113.048579538" lastFinishedPulling="2025-10-09 10:46:17.603818665 +0000 UTC m=+1116.566019046" observedRunningTime="2025-10-09 10:46:18.382424545 +0000 UTC m=+1117.344624936" watchObservedRunningTime="2025-10-09 10:46:18.386269789 +0000 UTC m=+1117.348470170" Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.515686 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-92ee-account-create-jdtxn"] Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.517098 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-92ee-account-create-jdtxn" Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.521040 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.525393 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-92ee-account-create-jdtxn"] Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.619836 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-c064-account-create-h5v2f"] Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.621177 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c064-account-create-h5v2f" Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.628719 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.630206 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c064-account-create-h5v2f"] Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.650780 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx6v9\" (UniqueName: \"kubernetes.io/projected/d6d55cea-26c8-45cc-a7fd-c741620a164a-kube-api-access-kx6v9\") pod \"nova-api-92ee-account-create-jdtxn\" (UID: \"d6d55cea-26c8-45cc-a7fd-c741620a164a\") " pod="openstack/nova-api-92ee-account-create-jdtxn" Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.752157 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx6v9\" (UniqueName: \"kubernetes.io/projected/d6d55cea-26c8-45cc-a7fd-c741620a164a-kube-api-access-kx6v9\") pod \"nova-api-92ee-account-create-jdtxn\" (UID: \"d6d55cea-26c8-45cc-a7fd-c741620a164a\") " pod="openstack/nova-api-92ee-account-create-jdtxn" Oct 09 10:46:18 crc kubenswrapper[4740]: 
I1009 10:46:18.752281 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcngk\" (UniqueName: \"kubernetes.io/projected/cb294930-8e11-4d2a-8965-6451b647fb16-kube-api-access-kcngk\") pod \"nova-cell0-c064-account-create-h5v2f\" (UID: \"cb294930-8e11-4d2a-8965-6451b647fb16\") " pod="openstack/nova-cell0-c064-account-create-h5v2f" Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.772941 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx6v9\" (UniqueName: \"kubernetes.io/projected/d6d55cea-26c8-45cc-a7fd-c741620a164a-kube-api-access-kx6v9\") pod \"nova-api-92ee-account-create-jdtxn\" (UID: \"d6d55cea-26c8-45cc-a7fd-c741620a164a\") " pod="openstack/nova-api-92ee-account-create-jdtxn" Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.826147 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-3d7e-account-create-g7zgb"] Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.827638 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3d7e-account-create-g7zgb" Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.830088 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.837382 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-92ee-account-create-jdtxn" Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.852968 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3d7e-account-create-g7zgb"] Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.854020 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcngk\" (UniqueName: \"kubernetes.io/projected/cb294930-8e11-4d2a-8965-6451b647fb16-kube-api-access-kcngk\") pod \"nova-cell0-c064-account-create-h5v2f\" (UID: \"cb294930-8e11-4d2a-8965-6451b647fb16\") " pod="openstack/nova-cell0-c064-account-create-h5v2f" Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.889805 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcngk\" (UniqueName: \"kubernetes.io/projected/cb294930-8e11-4d2a-8965-6451b647fb16-kube-api-access-kcngk\") pod \"nova-cell0-c064-account-create-h5v2f\" (UID: \"cb294930-8e11-4d2a-8965-6451b647fb16\") " pod="openstack/nova-cell0-c064-account-create-h5v2f" Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.942596 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c064-account-create-h5v2f" Oct 09 10:46:18 crc kubenswrapper[4740]: I1009 10:46:18.956463 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xfj5\" (UniqueName: \"kubernetes.io/projected/57e9966b-77cc-4158-8cb0-703ee3cb30f5-kube-api-access-5xfj5\") pod \"nova-cell1-3d7e-account-create-g7zgb\" (UID: \"57e9966b-77cc-4158-8cb0-703ee3cb30f5\") " pod="openstack/nova-cell1-3d7e-account-create-g7zgb" Oct 09 10:46:19 crc kubenswrapper[4740]: I1009 10:46:19.058254 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xfj5\" (UniqueName: \"kubernetes.io/projected/57e9966b-77cc-4158-8cb0-703ee3cb30f5-kube-api-access-5xfj5\") pod \"nova-cell1-3d7e-account-create-g7zgb\" (UID: \"57e9966b-77cc-4158-8cb0-703ee3cb30f5\") " pod="openstack/nova-cell1-3d7e-account-create-g7zgb" Oct 09 10:46:19 crc kubenswrapper[4740]: I1009 10:46:19.089136 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xfj5\" (UniqueName: \"kubernetes.io/projected/57e9966b-77cc-4158-8cb0-703ee3cb30f5-kube-api-access-5xfj5\") pod \"nova-cell1-3d7e-account-create-g7zgb\" (UID: \"57e9966b-77cc-4158-8cb0-703ee3cb30f5\") " pod="openstack/nova-cell1-3d7e-account-create-g7zgb" Oct 09 10:46:19 crc kubenswrapper[4740]: I1009 10:46:19.154246 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-3d7e-account-create-g7zgb" Oct 09 10:46:19 crc kubenswrapper[4740]: I1009 10:46:19.407592 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-92ee-account-create-jdtxn"] Oct 09 10:46:19 crc kubenswrapper[4740]: I1009 10:46:19.558571 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c064-account-create-h5v2f"] Oct 09 10:46:19 crc kubenswrapper[4740]: W1009 10:46:19.563245 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb294930_8e11_4d2a_8965_6451b647fb16.slice/crio-0ed82a2732b6c82c2c5840ce4bbe80222bc324f01f16c39da113d6222082fa2e WatchSource:0}: Error finding container 0ed82a2732b6c82c2c5840ce4bbe80222bc324f01f16c39da113d6222082fa2e: Status 404 returned error can't find the container with id 0ed82a2732b6c82c2c5840ce4bbe80222bc324f01f16c39da113d6222082fa2e Oct 09 10:46:19 crc kubenswrapper[4740]: I1009 10:46:19.667035 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3d7e-account-create-g7zgb"] Oct 09 10:46:19 crc kubenswrapper[4740]: W1009 10:46:19.728399 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57e9966b_77cc_4158_8cb0_703ee3cb30f5.slice/crio-8e68658a7b93e7be1360661a7fb78723ee04d1b4cc48c5364ff0e4766aa6d059 WatchSource:0}: Error finding container 8e68658a7b93e7be1360661a7fb78723ee04d1b4cc48c5364ff0e4766aa6d059: Status 404 returned error can't find the container with id 8e68658a7b93e7be1360661a7fb78723ee04d1b4cc48c5364ff0e4766aa6d059 Oct 09 10:46:20 crc kubenswrapper[4740]: I1009 10:46:20.325526 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 10:46:20 crc kubenswrapper[4740]: I1009 10:46:20.326096 4740 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="09308063-0c8c-4f0a-83f5-779364607b38" containerName="glance-log" containerID="cri-o://a0e94c39a36122188ca1d34ed558936f01edc7c33c4ad865ed2856dd362fb206" gracePeriod=30 Oct 09 10:46:20 crc kubenswrapper[4740]: I1009 10:46:20.326491 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="09308063-0c8c-4f0a-83f5-779364607b38" containerName="glance-httpd" containerID="cri-o://d441530e363950c3e320e9b8c7d8da2221fd27b17faee5645ff5761c6c531dc7" gracePeriod=30 Oct 09 10:46:20 crc kubenswrapper[4740]: I1009 10:46:20.388568 4740 generic.go:334] "Generic (PLEG): container finished" podID="57e9966b-77cc-4158-8cb0-703ee3cb30f5" containerID="2fdd68a273eb8a46c593e136d890dc23d53405150f2a6f0cbbfa25cd415f3f83" exitCode=0 Oct 09 10:46:20 crc kubenswrapper[4740]: I1009 10:46:20.388843 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3d7e-account-create-g7zgb" event={"ID":"57e9966b-77cc-4158-8cb0-703ee3cb30f5","Type":"ContainerDied","Data":"2fdd68a273eb8a46c593e136d890dc23d53405150f2a6f0cbbfa25cd415f3f83"} Oct 09 10:46:20 crc kubenswrapper[4740]: I1009 10:46:20.388946 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3d7e-account-create-g7zgb" event={"ID":"57e9966b-77cc-4158-8cb0-703ee3cb30f5","Type":"ContainerStarted","Data":"8e68658a7b93e7be1360661a7fb78723ee04d1b4cc48c5364ff0e4766aa6d059"} Oct 09 10:46:20 crc kubenswrapper[4740]: I1009 10:46:20.390186 4740 generic.go:334] "Generic (PLEG): container finished" podID="cb294930-8e11-4d2a-8965-6451b647fb16" containerID="2f0c772a86b0ec45f94d9955b092d9053a8a3b47a4d6b1957f73a9b0f76ae1d3" exitCode=0 Oct 09 10:46:20 crc kubenswrapper[4740]: I1009 10:46:20.390270 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c064-account-create-h5v2f" 
event={"ID":"cb294930-8e11-4d2a-8965-6451b647fb16","Type":"ContainerDied","Data":"2f0c772a86b0ec45f94d9955b092d9053a8a3b47a4d6b1957f73a9b0f76ae1d3"} Oct 09 10:46:20 crc kubenswrapper[4740]: I1009 10:46:20.390312 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c064-account-create-h5v2f" event={"ID":"cb294930-8e11-4d2a-8965-6451b647fb16","Type":"ContainerStarted","Data":"0ed82a2732b6c82c2c5840ce4bbe80222bc324f01f16c39da113d6222082fa2e"} Oct 09 10:46:20 crc kubenswrapper[4740]: I1009 10:46:20.395722 4740 generic.go:334] "Generic (PLEG): container finished" podID="d6d55cea-26c8-45cc-a7fd-c741620a164a" containerID="11b89eb5f162c845cf376342103dc4a861f7efe4c7797fe301820df30e6a64fb" exitCode=0 Oct 09 10:46:20 crc kubenswrapper[4740]: I1009 10:46:20.395781 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-92ee-account-create-jdtxn" event={"ID":"d6d55cea-26c8-45cc-a7fd-c741620a164a","Type":"ContainerDied","Data":"11b89eb5f162c845cf376342103dc4a861f7efe4c7797fe301820df30e6a64fb"} Oct 09 10:46:20 crc kubenswrapper[4740]: I1009 10:46:20.395963 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-92ee-account-create-jdtxn" event={"ID":"d6d55cea-26c8-45cc-a7fd-c741620a164a","Type":"ContainerStarted","Data":"fdbaf23e2b391850351c60512fbc8d9581b569226d4b6097e1dda6327f1bf469"} Oct 09 10:46:21 crc kubenswrapper[4740]: I1009 10:46:21.111954 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 10:46:21 crc kubenswrapper[4740]: I1009 10:46:21.112203 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="31b6faa7-7a5d-47ba-8ee8-08866ee2933e" containerName="glance-log" containerID="cri-o://437be0aeabdf24344b8a91f29200dc317f18ebb4acb4cf84d5ff25e799b29027" gracePeriod=30 Oct 09 10:46:21 crc kubenswrapper[4740]: I1009 10:46:21.112349 4740 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="31b6faa7-7a5d-47ba-8ee8-08866ee2933e" containerName="glance-httpd" containerID="cri-o://7245c76c0a9620f25cb294bedb5646245c05a4c5c47ec0a715e2c39a30474bed" gracePeriod=30 Oct 09 10:46:21 crc kubenswrapper[4740]: I1009 10:46:21.398952 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:21 crc kubenswrapper[4740]: I1009 10:46:21.399485 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7216db2-398a-4681-84b0-094da77d597f" containerName="ceilometer-central-agent" containerID="cri-o://3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31" gracePeriod=30 Oct 09 10:46:21 crc kubenswrapper[4740]: I1009 10:46:21.399514 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7216db2-398a-4681-84b0-094da77d597f" containerName="sg-core" containerID="cri-o://806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433" gracePeriod=30 Oct 09 10:46:21 crc kubenswrapper[4740]: I1009 10:46:21.399547 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7216db2-398a-4681-84b0-094da77d597f" containerName="proxy-httpd" containerID="cri-o://b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab" gracePeriod=30 Oct 09 10:46:21 crc kubenswrapper[4740]: I1009 10:46:21.399635 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7216db2-398a-4681-84b0-094da77d597f" containerName="ceilometer-notification-agent" containerID="cri-o://4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c" gracePeriod=30 Oct 09 10:46:21 crc kubenswrapper[4740]: I1009 10:46:21.421779 4740 generic.go:334] "Generic (PLEG): container finished" podID="31b6faa7-7a5d-47ba-8ee8-08866ee2933e" 
containerID="437be0aeabdf24344b8a91f29200dc317f18ebb4acb4cf84d5ff25e799b29027" exitCode=143 Oct 09 10:46:21 crc kubenswrapper[4740]: I1009 10:46:21.421864 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"31b6faa7-7a5d-47ba-8ee8-08866ee2933e","Type":"ContainerDied","Data":"437be0aeabdf24344b8a91f29200dc317f18ebb4acb4cf84d5ff25e799b29027"} Oct 09 10:46:21 crc kubenswrapper[4740]: I1009 10:46:21.425109 4740 generic.go:334] "Generic (PLEG): container finished" podID="09308063-0c8c-4f0a-83f5-779364607b38" containerID="a0e94c39a36122188ca1d34ed558936f01edc7c33c4ad865ed2856dd362fb206" exitCode=143 Oct 09 10:46:21 crc kubenswrapper[4740]: I1009 10:46:21.425283 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09308063-0c8c-4f0a-83f5-779364607b38","Type":"ContainerDied","Data":"a0e94c39a36122188ca1d34ed558936f01edc7c33c4ad865ed2856dd362fb206"} Oct 09 10:46:21 crc kubenswrapper[4740]: I1009 10:46:21.902462 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-92ee-account-create-jdtxn" Oct 09 10:46:21 crc kubenswrapper[4740]: I1009 10:46:21.913678 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3d7e-account-create-g7zgb" Oct 09 10:46:21 crc kubenswrapper[4740]: I1009 10:46:21.927887 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c064-account-create-h5v2f" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.032168 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx6v9\" (UniqueName: \"kubernetes.io/projected/d6d55cea-26c8-45cc-a7fd-c741620a164a-kube-api-access-kx6v9\") pod \"d6d55cea-26c8-45cc-a7fd-c741620a164a\" (UID: \"d6d55cea-26c8-45cc-a7fd-c741620a164a\") " Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.032306 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xfj5\" (UniqueName: \"kubernetes.io/projected/57e9966b-77cc-4158-8cb0-703ee3cb30f5-kube-api-access-5xfj5\") pod \"57e9966b-77cc-4158-8cb0-703ee3cb30f5\" (UID: \"57e9966b-77cc-4158-8cb0-703ee3cb30f5\") " Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.032376 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcngk\" (UniqueName: \"kubernetes.io/projected/cb294930-8e11-4d2a-8965-6451b647fb16-kube-api-access-kcngk\") pod \"cb294930-8e11-4d2a-8965-6451b647fb16\" (UID: \"cb294930-8e11-4d2a-8965-6451b647fb16\") " Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.069075 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb294930-8e11-4d2a-8965-6451b647fb16-kube-api-access-kcngk" (OuterVolumeSpecName: "kube-api-access-kcngk") pod "cb294930-8e11-4d2a-8965-6451b647fb16" (UID: "cb294930-8e11-4d2a-8965-6451b647fb16"). InnerVolumeSpecName "kube-api-access-kcngk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.069138 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6d55cea-26c8-45cc-a7fd-c741620a164a-kube-api-access-kx6v9" (OuterVolumeSpecName: "kube-api-access-kx6v9") pod "d6d55cea-26c8-45cc-a7fd-c741620a164a" (UID: "d6d55cea-26c8-45cc-a7fd-c741620a164a"). InnerVolumeSpecName "kube-api-access-kx6v9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.069709 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57e9966b-77cc-4158-8cb0-703ee3cb30f5-kube-api-access-5xfj5" (OuterVolumeSpecName: "kube-api-access-5xfj5") pod "57e9966b-77cc-4158-8cb0-703ee3cb30f5" (UID: "57e9966b-77cc-4158-8cb0-703ee3cb30f5"). InnerVolumeSpecName "kube-api-access-5xfj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.134104 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcngk\" (UniqueName: \"kubernetes.io/projected/cb294930-8e11-4d2a-8965-6451b647fb16-kube-api-access-kcngk\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.134137 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx6v9\" (UniqueName: \"kubernetes.io/projected/d6d55cea-26c8-45cc-a7fd-c741620a164a-kube-api-access-kx6v9\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.134146 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xfj5\" (UniqueName: \"kubernetes.io/projected/57e9966b-77cc-4158-8cb0-703ee3cb30f5-kube-api-access-5xfj5\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.339909 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.434511 4740 generic.go:334] "Generic (PLEG): container finished" podID="d7216db2-398a-4681-84b0-094da77d597f" containerID="b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab" exitCode=0 Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.434545 4740 generic.go:334] "Generic (PLEG): container finished" podID="d7216db2-398a-4681-84b0-094da77d597f" containerID="806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433" exitCode=2 Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.434557 4740 generic.go:334] "Generic (PLEG): container finished" podID="d7216db2-398a-4681-84b0-094da77d597f" containerID="4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c" exitCode=0 Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.434565 4740 generic.go:334] "Generic (PLEG): container finished" podID="d7216db2-398a-4681-84b0-094da77d597f" containerID="3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31" exitCode=0 Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.434602 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7216db2-398a-4681-84b0-094da77d597f","Type":"ContainerDied","Data":"b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab"} Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.434628 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7216db2-398a-4681-84b0-094da77d597f","Type":"ContainerDied","Data":"806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433"} Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.434637 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7216db2-398a-4681-84b0-094da77d597f","Type":"ContainerDied","Data":"4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c"} Oct 09 10:46:22 crc 
kubenswrapper[4740]: I1009 10:46:22.434645 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7216db2-398a-4681-84b0-094da77d597f","Type":"ContainerDied","Data":"3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31"} Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.434653 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7216db2-398a-4681-84b0-094da77d597f","Type":"ContainerDied","Data":"15a54113a82d4b611f422a25a08d368917d507672bfe531664d7e81855745942"} Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.434667 4740 scope.go:117] "RemoveContainer" containerID="b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.434808 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.437798 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-scripts\") pod \"d7216db2-398a-4681-84b0-094da77d597f\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.437836 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-config-data\") pod \"d7216db2-398a-4681-84b0-094da77d597f\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.437928 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-combined-ca-bundle\") pod \"d7216db2-398a-4681-84b0-094da77d597f\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " Oct 09 10:46:22 crc 
kubenswrapper[4740]: I1009 10:46:22.437958 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7216db2-398a-4681-84b0-094da77d597f-run-httpd\") pod \"d7216db2-398a-4681-84b0-094da77d597f\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.437979 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-sg-core-conf-yaml\") pod \"d7216db2-398a-4681-84b0-094da77d597f\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.438015 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7216db2-398a-4681-84b0-094da77d597f-log-httpd\") pod \"d7216db2-398a-4681-84b0-094da77d597f\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.438061 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb5sb\" (UniqueName: \"kubernetes.io/projected/d7216db2-398a-4681-84b0-094da77d597f-kube-api-access-tb5sb\") pod \"d7216db2-398a-4681-84b0-094da77d597f\" (UID: \"d7216db2-398a-4681-84b0-094da77d597f\") " Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.439326 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3d7e-account-create-g7zgb" event={"ID":"57e9966b-77cc-4158-8cb0-703ee3cb30f5","Type":"ContainerDied","Data":"8e68658a7b93e7be1360661a7fb78723ee04d1b4cc48c5364ff0e4766aa6d059"} Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.439359 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e68658a7b93e7be1360661a7fb78723ee04d1b4cc48c5364ff0e4766aa6d059" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.439411 4740 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3d7e-account-create-g7zgb" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.439640 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7216db2-398a-4681-84b0-094da77d597f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d7216db2-398a-4681-84b0-094da77d597f" (UID: "d7216db2-398a-4681-84b0-094da77d597f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.439668 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7216db2-398a-4681-84b0-094da77d597f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d7216db2-398a-4681-84b0-094da77d597f" (UID: "d7216db2-398a-4681-84b0-094da77d597f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.443427 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c064-account-create-h5v2f" event={"ID":"cb294930-8e11-4d2a-8965-6451b647fb16","Type":"ContainerDied","Data":"0ed82a2732b6c82c2c5840ce4bbe80222bc324f01f16c39da113d6222082fa2e"} Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.443467 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ed82a2732b6c82c2c5840ce4bbe80222bc324f01f16c39da113d6222082fa2e" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.443535 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c064-account-create-h5v2f" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.444661 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7216db2-398a-4681-84b0-094da77d597f-kube-api-access-tb5sb" (OuterVolumeSpecName: "kube-api-access-tb5sb") pod "d7216db2-398a-4681-84b0-094da77d597f" (UID: "d7216db2-398a-4681-84b0-094da77d597f"). InnerVolumeSpecName "kube-api-access-tb5sb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.448917 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-scripts" (OuterVolumeSpecName: "scripts") pod "d7216db2-398a-4681-84b0-094da77d597f" (UID: "d7216db2-398a-4681-84b0-094da77d597f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.450411 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-92ee-account-create-jdtxn" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.450882 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-92ee-account-create-jdtxn" event={"ID":"d6d55cea-26c8-45cc-a7fd-c741620a164a","Type":"ContainerDied","Data":"fdbaf23e2b391850351c60512fbc8d9581b569226d4b6097e1dda6327f1bf469"} Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.450921 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdbaf23e2b391850351c60512fbc8d9581b569226d4b6097e1dda6327f1bf469" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.453463 4740 generic.go:334] "Generic (PLEG): container finished" podID="3a555e46-61f3-4a67-9d18-9acf13b859f7" containerID="e3247a241eda0c21d93b7a8f1fdd98f5d0ab875cb2dec08ca673ec9a6f900664" exitCode=0 Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.453490 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f54d79f5b-tsjf9" event={"ID":"3a555e46-61f3-4a67-9d18-9acf13b859f7","Type":"ContainerDied","Data":"e3247a241eda0c21d93b7a8f1fdd98f5d0ab875cb2dec08ca673ec9a6f900664"} Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.472290 4740 scope.go:117] "RemoveContainer" containerID="806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.490304 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d7216db2-398a-4681-84b0-094da77d597f" (UID: "d7216db2-398a-4681-84b0-094da77d597f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.541981 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.542016 4740 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7216db2-398a-4681-84b0-094da77d597f-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.542030 4740 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.542041 4740 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7216db2-398a-4681-84b0-094da77d597f-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.542053 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb5sb\" (UniqueName: \"kubernetes.io/projected/d7216db2-398a-4681-84b0-094da77d597f-kube-api-access-tb5sb\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.566500 4740 scope.go:117] "RemoveContainer" containerID="4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.571835 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7216db2-398a-4681-84b0-094da77d597f" (UID: "d7216db2-398a-4681-84b0-094da77d597f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.579683 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-config-data" (OuterVolumeSpecName: "config-data") pod "d7216db2-398a-4681-84b0-094da77d597f" (UID: "d7216db2-398a-4681-84b0-094da77d597f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.592309 4740 scope.go:117] "RemoveContainer" containerID="3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.615599 4740 scope.go:117] "RemoveContainer" containerID="b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab" Oct 09 10:46:22 crc kubenswrapper[4740]: E1009 10:46:22.617327 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab\": container with ID starting with b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab not found: ID does not exist" containerID="b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.617368 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab"} err="failed to get container status \"b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab\": rpc error: code = NotFound desc = could not find container \"b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab\": container with ID starting with b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab not found: ID does not exist" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.617395 4740 scope.go:117] "RemoveContainer" 
containerID="806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433" Oct 09 10:46:22 crc kubenswrapper[4740]: E1009 10:46:22.617864 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433\": container with ID starting with 806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433 not found: ID does not exist" containerID="806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.617909 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433"} err="failed to get container status \"806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433\": rpc error: code = NotFound desc = could not find container \"806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433\": container with ID starting with 806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433 not found: ID does not exist" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.617942 4740 scope.go:117] "RemoveContainer" containerID="4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c" Oct 09 10:46:22 crc kubenswrapper[4740]: E1009 10:46:22.618320 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c\": container with ID starting with 4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c not found: ID does not exist" containerID="4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.618349 4740 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c"} err="failed to get container status \"4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c\": rpc error: code = NotFound desc = could not find container \"4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c\": container with ID starting with 4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c not found: ID does not exist" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.618367 4740 scope.go:117] "RemoveContainer" containerID="3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31" Oct 09 10:46:22 crc kubenswrapper[4740]: E1009 10:46:22.618566 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31\": container with ID starting with 3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31 not found: ID does not exist" containerID="3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.618592 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31"} err="failed to get container status \"3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31\": rpc error: code = NotFound desc = could not find container \"3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31\": container with ID starting with 3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31 not found: ID does not exist" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.618620 4740 scope.go:117] "RemoveContainer" containerID="b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.619879 4740 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab"} err="failed to get container status \"b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab\": rpc error: code = NotFound desc = could not find container \"b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab\": container with ID starting with b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab not found: ID does not exist" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.619913 4740 scope.go:117] "RemoveContainer" containerID="806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.620176 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433"} err="failed to get container status \"806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433\": rpc error: code = NotFound desc = could not find container \"806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433\": container with ID starting with 806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433 not found: ID does not exist" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.620212 4740 scope.go:117] "RemoveContainer" containerID="4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.620444 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c"} err="failed to get container status \"4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c\": rpc error: code = NotFound desc = could not find container \"4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c\": container with ID starting with 4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c not 
found: ID does not exist" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.620470 4740 scope.go:117] "RemoveContainer" containerID="3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.620659 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31"} err="failed to get container status \"3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31\": rpc error: code = NotFound desc = could not find container \"3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31\": container with ID starting with 3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31 not found: ID does not exist" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.620686 4740 scope.go:117] "RemoveContainer" containerID="b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.620923 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab"} err="failed to get container status \"b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab\": rpc error: code = NotFound desc = could not find container \"b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab\": container with ID starting with b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab not found: ID does not exist" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.620944 4740 scope.go:117] "RemoveContainer" containerID="806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.621160 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433"} err="failed to get 
container status \"806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433\": rpc error: code = NotFound desc = could not find container \"806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433\": container with ID starting with 806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433 not found: ID does not exist" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.621183 4740 scope.go:117] "RemoveContainer" containerID="4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.622624 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c"} err="failed to get container status \"4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c\": rpc error: code = NotFound desc = could not find container \"4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c\": container with ID starting with 4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c not found: ID does not exist" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.622646 4740 scope.go:117] "RemoveContainer" containerID="3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.622892 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31"} err="failed to get container status \"3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31\": rpc error: code = NotFound desc = could not find container \"3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31\": container with ID starting with 3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31 not found: ID does not exist" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.622912 4740 scope.go:117] "RemoveContainer" 
containerID="b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.623083 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab"} err="failed to get container status \"b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab\": rpc error: code = NotFound desc = could not find container \"b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab\": container with ID starting with b322b37dcb7824c253a449d4aaced1b54e406924989f41769946805df6814cab not found: ID does not exist" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.623105 4740 scope.go:117] "RemoveContainer" containerID="806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.623278 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433"} err="failed to get container status \"806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433\": rpc error: code = NotFound desc = could not find container \"806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433\": container with ID starting with 806ac06b838592f7a1ce568ac4ab8d40d2b38969b849b12fd32fc6d84dc56433 not found: ID does not exist" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.623296 4740 scope.go:117] "RemoveContainer" containerID="4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.623455 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c"} err="failed to get container status \"4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c\": rpc error: code = NotFound desc = could 
not find container \"4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c\": container with ID starting with 4d823b658148022ec91631b7a53c214fa68e911122d16db5f7be0ba3187f787c not found: ID does not exist" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.623477 4740 scope.go:117] "RemoveContainer" containerID="3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.623642 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31"} err="failed to get container status \"3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31\": rpc error: code = NotFound desc = could not find container \"3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31\": container with ID starting with 3f40f7eee63d5fb17920b886315c01f4f4a311eb52ab4ca1dcd9490696ce6e31 not found: ID does not exist" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.643335 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.643365 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7216db2-398a-4681-84b0-094da77d597f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.727962 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5f54d79f5b-tsjf9" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.783630 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.795952 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.847995 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.848362 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-httpd-config\") pod \"3a555e46-61f3-4a67-9d18-9acf13b859f7\" (UID: \"3a555e46-61f3-4a67-9d18-9acf13b859f7\") " Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.848486 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-ovndb-tls-certs\") pod \"3a555e46-61f3-4a67-9d18-9acf13b859f7\" (UID: \"3a555e46-61f3-4a67-9d18-9acf13b859f7\") " Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.848531 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-config\") pod \"3a555e46-61f3-4a67-9d18-9acf13b859f7\" (UID: \"3a555e46-61f3-4a67-9d18-9acf13b859f7\") " Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.848632 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-combined-ca-bundle\") pod \"3a555e46-61f3-4a67-9d18-9acf13b859f7\" (UID: \"3a555e46-61f3-4a67-9d18-9acf13b859f7\") " Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.848724 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-csnj2\" (UniqueName: \"kubernetes.io/projected/3a555e46-61f3-4a67-9d18-9acf13b859f7-kube-api-access-csnj2\") pod \"3a555e46-61f3-4a67-9d18-9acf13b859f7\" (UID: \"3a555e46-61f3-4a67-9d18-9acf13b859f7\") " Oct 09 10:46:22 crc kubenswrapper[4740]: E1009 10:46:22.848731 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7216db2-398a-4681-84b0-094da77d597f" containerName="ceilometer-notification-agent" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.856840 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7216db2-398a-4681-84b0-094da77d597f" containerName="ceilometer-notification-agent" Oct 09 10:46:22 crc kubenswrapper[4740]: E1009 10:46:22.856869 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e9966b-77cc-4158-8cb0-703ee3cb30f5" containerName="mariadb-account-create" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.856877 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="57e9966b-77cc-4158-8cb0-703ee3cb30f5" containerName="mariadb-account-create" Oct 09 10:46:22 crc kubenswrapper[4740]: E1009 10:46:22.856910 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a555e46-61f3-4a67-9d18-9acf13b859f7" containerName="neutron-api" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.856919 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a555e46-61f3-4a67-9d18-9acf13b859f7" containerName="neutron-api" Oct 09 10:46:22 crc kubenswrapper[4740]: E1009 10:46:22.856949 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7216db2-398a-4681-84b0-094da77d597f" containerName="proxy-httpd" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.856956 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7216db2-398a-4681-84b0-094da77d597f" containerName="proxy-httpd" Oct 09 10:46:22 crc kubenswrapper[4740]: E1009 10:46:22.856967 4740 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="3a555e46-61f3-4a67-9d18-9acf13b859f7" containerName="neutron-httpd" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.856974 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a555e46-61f3-4a67-9d18-9acf13b859f7" containerName="neutron-httpd" Oct 09 10:46:22 crc kubenswrapper[4740]: E1009 10:46:22.856988 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6d55cea-26c8-45cc-a7fd-c741620a164a" containerName="mariadb-account-create" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.856994 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6d55cea-26c8-45cc-a7fd-c741620a164a" containerName="mariadb-account-create" Oct 09 10:46:22 crc kubenswrapper[4740]: E1009 10:46:22.857021 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7216db2-398a-4681-84b0-094da77d597f" containerName="sg-core" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.857029 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7216db2-398a-4681-84b0-094da77d597f" containerName="sg-core" Oct 09 10:46:22 crc kubenswrapper[4740]: E1009 10:46:22.857045 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7216db2-398a-4681-84b0-094da77d597f" containerName="ceilometer-central-agent" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.857051 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7216db2-398a-4681-84b0-094da77d597f" containerName="ceilometer-central-agent" Oct 09 10:46:22 crc kubenswrapper[4740]: E1009 10:46:22.857062 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb294930-8e11-4d2a-8965-6451b647fb16" containerName="mariadb-account-create" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.857069 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb294930-8e11-4d2a-8965-6451b647fb16" containerName="mariadb-account-create" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.857418 4740 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d7216db2-398a-4681-84b0-094da77d597f" containerName="ceilometer-notification-agent" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.857438 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6d55cea-26c8-45cc-a7fd-c741620a164a" containerName="mariadb-account-create" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.857454 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="57e9966b-77cc-4158-8cb0-703ee3cb30f5" containerName="mariadb-account-create" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.857466 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb294930-8e11-4d2a-8965-6451b647fb16" containerName="mariadb-account-create" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.857479 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a555e46-61f3-4a67-9d18-9acf13b859f7" containerName="neutron-api" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.857488 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a555e46-61f3-4a67-9d18-9acf13b859f7" containerName="neutron-httpd" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.857503 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7216db2-398a-4681-84b0-094da77d597f" containerName="ceilometer-central-agent" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.857513 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7216db2-398a-4681-84b0-094da77d597f" containerName="sg-core" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.857530 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7216db2-398a-4681-84b0-094da77d597f" containerName="proxy-httpd" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.861064 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.862727 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.864815 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.870181 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a555e46-61f3-4a67-9d18-9acf13b859f7-kube-api-access-csnj2" (OuterVolumeSpecName: "kube-api-access-csnj2") pod "3a555e46-61f3-4a67-9d18-9acf13b859f7" (UID: "3a555e46-61f3-4a67-9d18-9acf13b859f7"). InnerVolumeSpecName "kube-api-access-csnj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.870265 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.870285 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "3a555e46-61f3-4a67-9d18-9acf13b859f7" (UID: "3a555e46-61f3-4a67-9d18-9acf13b859f7"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.965273 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa61d9ff-5491-44c2-8004-908ed41ffad5-log-httpd\") pod \"ceilometer-0\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " pod="openstack/ceilometer-0" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.965351 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-config-data\") pod \"ceilometer-0\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " pod="openstack/ceilometer-0" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.965451 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " pod="openstack/ceilometer-0" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.965519 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb8zq\" (UniqueName: \"kubernetes.io/projected/aa61d9ff-5491-44c2-8004-908ed41ffad5-kube-api-access-fb8zq\") pod \"ceilometer-0\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " pod="openstack/ceilometer-0" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.965584 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " pod="openstack/ceilometer-0" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.965610 
4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa61d9ff-5491-44c2-8004-908ed41ffad5-run-httpd\") pod \"ceilometer-0\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " pod="openstack/ceilometer-0" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.965780 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-scripts\") pod \"ceilometer-0\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " pod="openstack/ceilometer-0" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.965939 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csnj2\" (UniqueName: \"kubernetes.io/projected/3a555e46-61f3-4a67-9d18-9acf13b859f7-kube-api-access-csnj2\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.965959 4740 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.971012 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-config" (OuterVolumeSpecName: "config") pod "3a555e46-61f3-4a67-9d18-9acf13b859f7" (UID: "3a555e46-61f3-4a67-9d18-9acf13b859f7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:22 crc kubenswrapper[4740]: I1009 10:46:22.992840 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a555e46-61f3-4a67-9d18-9acf13b859f7" (UID: "3a555e46-61f3-4a67-9d18-9acf13b859f7"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.006002 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "3a555e46-61f3-4a67-9d18-9acf13b859f7" (UID: "3a555e46-61f3-4a67-9d18-9acf13b859f7"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.067368 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa61d9ff-5491-44c2-8004-908ed41ffad5-log-httpd\") pod \"ceilometer-0\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " pod="openstack/ceilometer-0" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.067437 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-config-data\") pod \"ceilometer-0\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " pod="openstack/ceilometer-0" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.067507 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " pod="openstack/ceilometer-0" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.067550 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb8zq\" (UniqueName: \"kubernetes.io/projected/aa61d9ff-5491-44c2-8004-908ed41ffad5-kube-api-access-fb8zq\") pod \"ceilometer-0\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " pod="openstack/ceilometer-0" Oct 09 10:46:23 crc 
kubenswrapper[4740]: I1009 10:46:23.067608 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " pod="openstack/ceilometer-0" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.067628 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa61d9ff-5491-44c2-8004-908ed41ffad5-run-httpd\") pod \"ceilometer-0\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " pod="openstack/ceilometer-0" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.067710 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-scripts\") pod \"ceilometer-0\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " pod="openstack/ceilometer-0" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.067796 4740 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.067813 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.067826 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a555e46-61f3-4a67-9d18-9acf13b859f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.068862 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/aa61d9ff-5491-44c2-8004-908ed41ffad5-run-httpd\") pod \"ceilometer-0\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " pod="openstack/ceilometer-0" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.069069 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa61d9ff-5491-44c2-8004-908ed41ffad5-log-httpd\") pod \"ceilometer-0\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " pod="openstack/ceilometer-0" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.071746 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-scripts\") pod \"ceilometer-0\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " pod="openstack/ceilometer-0" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.071902 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " pod="openstack/ceilometer-0" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.072173 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " pod="openstack/ceilometer-0" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.073058 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-config-data\") pod \"ceilometer-0\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " pod="openstack/ceilometer-0" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.088128 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fb8zq\" (UniqueName: \"kubernetes.io/projected/aa61d9ff-5491-44c2-8004-908ed41ffad5-kube-api-access-fb8zq\") pod \"ceilometer-0\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " pod="openstack/ceilometer-0" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.190001 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.463634 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f54d79f5b-tsjf9" event={"ID":"3a555e46-61f3-4a67-9d18-9acf13b859f7","Type":"ContainerDied","Data":"540440658e413ab2a730b8b0ac3726c8100b2b239c2af74077fdaa83ac5d389b"} Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.463644 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5f54d79f5b-tsjf9" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.464004 4740 scope.go:117] "RemoveContainer" containerID="101e0a0b0c0e372ef0da471fe6744819ea5832156da9489ca62d0b38ba0b893a" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.494680 4740 scope.go:117] "RemoveContainer" containerID="e3247a241eda0c21d93b7a8f1fdd98f5d0ab875cb2dec08ca673ec9a6f900664" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.523584 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5f54d79f5b-tsjf9"] Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.530083 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5f54d79f5b-tsjf9"] Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.662379 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.763718 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a555e46-61f3-4a67-9d18-9acf13b859f7" 
path="/var/lib/kubelet/pods/3a555e46-61f3-4a67-9d18-9acf13b859f7/volumes" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.764883 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7216db2-398a-4681-84b0-094da77d597f" path="/var/lib/kubelet/pods/d7216db2-398a-4681-84b0-094da77d597f/volumes" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.856383 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gwq7v"] Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.857470 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gwq7v" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.863923 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.864131 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qd29k" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.864272 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.870899 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gwq7v"] Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.986429 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a69aabb5-1a08-483b-b60b-65080c36912c-scripts\") pod \"nova-cell0-conductor-db-sync-gwq7v\" (UID: \"a69aabb5-1a08-483b-b60b-65080c36912c\") " pod="openstack/nova-cell0-conductor-db-sync-gwq7v" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.986472 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qfvb\" (UniqueName: 
\"kubernetes.io/projected/a69aabb5-1a08-483b-b60b-65080c36912c-kube-api-access-5qfvb\") pod \"nova-cell0-conductor-db-sync-gwq7v\" (UID: \"a69aabb5-1a08-483b-b60b-65080c36912c\") " pod="openstack/nova-cell0-conductor-db-sync-gwq7v" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.986813 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69aabb5-1a08-483b-b60b-65080c36912c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gwq7v\" (UID: \"a69aabb5-1a08-483b-b60b-65080c36912c\") " pod="openstack/nova-cell0-conductor-db-sync-gwq7v" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.986984 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a69aabb5-1a08-483b-b60b-65080c36912c-config-data\") pod \"nova-cell0-conductor-db-sync-gwq7v\" (UID: \"a69aabb5-1a08-483b-b60b-65080c36912c\") " pod="openstack/nova-cell0-conductor-db-sync-gwq7v" Oct 09 10:46:23 crc kubenswrapper[4740]: I1009 10:46:23.998591 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.088852 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rshk\" (UniqueName: \"kubernetes.io/projected/09308063-0c8c-4f0a-83f5-779364607b38-kube-api-access-8rshk\") pod \"09308063-0c8c-4f0a-83f5-779364607b38\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.088998 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-public-tls-certs\") pod \"09308063-0c8c-4f0a-83f5-779364607b38\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.089030 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09308063-0c8c-4f0a-83f5-779364607b38-logs\") pod \"09308063-0c8c-4f0a-83f5-779364607b38\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.089113 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"09308063-0c8c-4f0a-83f5-779364607b38\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.089140 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-config-data\") pod \"09308063-0c8c-4f0a-83f5-779364607b38\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.089182 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-combined-ca-bundle\") pod \"09308063-0c8c-4f0a-83f5-779364607b38\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.089221 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09308063-0c8c-4f0a-83f5-779364607b38-httpd-run\") pod \"09308063-0c8c-4f0a-83f5-779364607b38\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.089262 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-scripts\") pod \"09308063-0c8c-4f0a-83f5-779364607b38\" (UID: \"09308063-0c8c-4f0a-83f5-779364607b38\") " Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.089582 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69aabb5-1a08-483b-b60b-65080c36912c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gwq7v\" (UID: \"a69aabb5-1a08-483b-b60b-65080c36912c\") " pod="openstack/nova-cell0-conductor-db-sync-gwq7v" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.089662 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a69aabb5-1a08-483b-b60b-65080c36912c-config-data\") pod \"nova-cell0-conductor-db-sync-gwq7v\" (UID: \"a69aabb5-1a08-483b-b60b-65080c36912c\") " pod="openstack/nova-cell0-conductor-db-sync-gwq7v" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.089801 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qfvb\" (UniqueName: \"kubernetes.io/projected/a69aabb5-1a08-483b-b60b-65080c36912c-kube-api-access-5qfvb\") pod \"nova-cell0-conductor-db-sync-gwq7v\" (UID: 
\"a69aabb5-1a08-483b-b60b-65080c36912c\") " pod="openstack/nova-cell0-conductor-db-sync-gwq7v" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.089831 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a69aabb5-1a08-483b-b60b-65080c36912c-scripts\") pod \"nova-cell0-conductor-db-sync-gwq7v\" (UID: \"a69aabb5-1a08-483b-b60b-65080c36912c\") " pod="openstack/nova-cell0-conductor-db-sync-gwq7v" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.092248 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09308063-0c8c-4f0a-83f5-779364607b38-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "09308063-0c8c-4f0a-83f5-779364607b38" (UID: "09308063-0c8c-4f0a-83f5-779364607b38"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.092273 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09308063-0c8c-4f0a-83f5-779364607b38-logs" (OuterVolumeSpecName: "logs") pod "09308063-0c8c-4f0a-83f5-779364607b38" (UID: "09308063-0c8c-4f0a-83f5-779364607b38"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.098908 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "09308063-0c8c-4f0a-83f5-779364607b38" (UID: "09308063-0c8c-4f0a-83f5-779364607b38"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.099731 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09308063-0c8c-4f0a-83f5-779364607b38-kube-api-access-8rshk" (OuterVolumeSpecName: "kube-api-access-8rshk") pod "09308063-0c8c-4f0a-83f5-779364607b38" (UID: "09308063-0c8c-4f0a-83f5-779364607b38"). InnerVolumeSpecName "kube-api-access-8rshk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.127290 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-scripts" (OuterVolumeSpecName: "scripts") pod "09308063-0c8c-4f0a-83f5-779364607b38" (UID: "09308063-0c8c-4f0a-83f5-779364607b38"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.127571 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69aabb5-1a08-483b-b60b-65080c36912c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gwq7v\" (UID: \"a69aabb5-1a08-483b-b60b-65080c36912c\") " pod="openstack/nova-cell0-conductor-db-sync-gwq7v" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.127879 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a69aabb5-1a08-483b-b60b-65080c36912c-config-data\") pod \"nova-cell0-conductor-db-sync-gwq7v\" (UID: \"a69aabb5-1a08-483b-b60b-65080c36912c\") " pod="openstack/nova-cell0-conductor-db-sync-gwq7v" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.128181 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a69aabb5-1a08-483b-b60b-65080c36912c-scripts\") pod \"nova-cell0-conductor-db-sync-gwq7v\" (UID: 
\"a69aabb5-1a08-483b-b60b-65080c36912c\") " pod="openstack/nova-cell0-conductor-db-sync-gwq7v" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.139792 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qfvb\" (UniqueName: \"kubernetes.io/projected/a69aabb5-1a08-483b-b60b-65080c36912c-kube-api-access-5qfvb\") pod \"nova-cell0-conductor-db-sync-gwq7v\" (UID: \"a69aabb5-1a08-483b-b60b-65080c36912c\") " pod="openstack/nova-cell0-conductor-db-sync-gwq7v" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.150875 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09308063-0c8c-4f0a-83f5-779364607b38" (UID: "09308063-0c8c-4f0a-83f5-779364607b38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.156793 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-config-data" (OuterVolumeSpecName: "config-data") pod "09308063-0c8c-4f0a-83f5-779364607b38" (UID: "09308063-0c8c-4f0a-83f5-779364607b38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.178955 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "09308063-0c8c-4f0a-83f5-779364607b38" (UID: "09308063-0c8c-4f0a-83f5-779364607b38"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.184564 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gwq7v" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.191707 4740 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.191732 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.191742 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.191772 4740 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09308063-0c8c-4f0a-83f5-779364607b38-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.191780 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.191788 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rshk\" (UniqueName: \"kubernetes.io/projected/09308063-0c8c-4f0a-83f5-779364607b38-kube-api-access-8rshk\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.191796 4740 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09308063-0c8c-4f0a-83f5-779364607b38-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.191804 4740 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09308063-0c8c-4f0a-83f5-779364607b38-logs\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.217490 4740 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.294534 4740 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.483429 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa61d9ff-5491-44c2-8004-908ed41ffad5","Type":"ContainerStarted","Data":"ab2b776172fdc4bfef22e26cf1c63a9afd3d83665e8ffe4e5dbf7b9b90735111"} Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.486250 4740 generic.go:334] "Generic (PLEG): container finished" podID="31b6faa7-7a5d-47ba-8ee8-08866ee2933e" containerID="7245c76c0a9620f25cb294bedb5646245c05a4c5c47ec0a715e2c39a30474bed" exitCode=0 Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.486320 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"31b6faa7-7a5d-47ba-8ee8-08866ee2933e","Type":"ContainerDied","Data":"7245c76c0a9620f25cb294bedb5646245c05a4c5c47ec0a715e2c39a30474bed"} Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.488884 4740 generic.go:334] "Generic (PLEG): container finished" podID="09308063-0c8c-4f0a-83f5-779364607b38" containerID="d441530e363950c3e320e9b8c7d8da2221fd27b17faee5645ff5761c6c531dc7" exitCode=0 Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.488950 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.488994 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09308063-0c8c-4f0a-83f5-779364607b38","Type":"ContainerDied","Data":"d441530e363950c3e320e9b8c7d8da2221fd27b17faee5645ff5761c6c531dc7"} Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.489031 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09308063-0c8c-4f0a-83f5-779364607b38","Type":"ContainerDied","Data":"2e5d27487f0f11d70d7c078105348de1f1755e5696b11f187ac77a8ceabc4ea7"} Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.489053 4740 scope.go:117] "RemoveContainer" containerID="d441530e363950c3e320e9b8c7d8da2221fd27b17faee5645ff5761c6c531dc7" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.530881 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.540638 4740 scope.go:117] "RemoveContainer" containerID="a0e94c39a36122188ca1d34ed558936f01edc7c33c4ad865ed2856dd362fb206" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.550178 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.574986 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 10:46:24 crc kubenswrapper[4740]: E1009 10:46:24.577777 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09308063-0c8c-4f0a-83f5-779364607b38" containerName="glance-httpd" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.577810 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="09308063-0c8c-4f0a-83f5-779364607b38" containerName="glance-httpd" Oct 09 10:46:24 crc kubenswrapper[4740]: E1009 
10:46:24.577847 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09308063-0c8c-4f0a-83f5-779364607b38" containerName="glance-log" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.577857 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="09308063-0c8c-4f0a-83f5-779364607b38" containerName="glance-log" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.578874 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="09308063-0c8c-4f0a-83f5-779364607b38" containerName="glance-httpd" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.578926 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="09308063-0c8c-4f0a-83f5-779364607b38" containerName="glance-log" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.583970 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.586282 4740 scope.go:117] "RemoveContainer" containerID="d441530e363950c3e320e9b8c7d8da2221fd27b17faee5645ff5761c6c531dc7" Oct 09 10:46:24 crc kubenswrapper[4740]: E1009 10:46:24.587396 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d441530e363950c3e320e9b8c7d8da2221fd27b17faee5645ff5761c6c531dc7\": container with ID starting with d441530e363950c3e320e9b8c7d8da2221fd27b17faee5645ff5761c6c531dc7 not found: ID does not exist" containerID="d441530e363950c3e320e9b8c7d8da2221fd27b17faee5645ff5761c6c531dc7" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.587430 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d441530e363950c3e320e9b8c7d8da2221fd27b17faee5645ff5761c6c531dc7"} err="failed to get container status \"d441530e363950c3e320e9b8c7d8da2221fd27b17faee5645ff5761c6c531dc7\": rpc error: code = NotFound desc = could not find container 
\"d441530e363950c3e320e9b8c7d8da2221fd27b17faee5645ff5761c6c531dc7\": container with ID starting with d441530e363950c3e320e9b8c7d8da2221fd27b17faee5645ff5761c6c531dc7 not found: ID does not exist" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.587453 4740 scope.go:117] "RemoveContainer" containerID="a0e94c39a36122188ca1d34ed558936f01edc7c33c4ad865ed2856dd362fb206" Oct 09 10:46:24 crc kubenswrapper[4740]: E1009 10:46:24.588225 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0e94c39a36122188ca1d34ed558936f01edc7c33c4ad865ed2856dd362fb206\": container with ID starting with a0e94c39a36122188ca1d34ed558936f01edc7c33c4ad865ed2856dd362fb206 not found: ID does not exist" containerID="a0e94c39a36122188ca1d34ed558936f01edc7c33c4ad865ed2856dd362fb206" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.588247 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0e94c39a36122188ca1d34ed558936f01edc7c33c4ad865ed2856dd362fb206"} err="failed to get container status \"a0e94c39a36122188ca1d34ed558936f01edc7c33c4ad865ed2856dd362fb206\": rpc error: code = NotFound desc = could not find container \"a0e94c39a36122188ca1d34ed558936f01edc7c33c4ad865ed2856dd362fb206\": container with ID starting with a0e94c39a36122188ca1d34ed558936f01edc7c33c4ad865ed2856dd362fb206 not found: ID does not exist" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.589289 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.590742 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.635658 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 10:46:24 crc kubenswrapper[4740]: 
I1009 10:46:24.707906 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.707961 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83ecd586-6121-4e74-91f1-87267432cc2d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.707990 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/83ecd586-6121-4e74-91f1-87267432cc2d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.708090 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83ecd586-6121-4e74-91f1-87267432cc2d-logs\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.708117 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83ecd586-6121-4e74-91f1-87267432cc2d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc 
kubenswrapper[4740]: I1009 10:46:24.708141 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqcgm\" (UniqueName: \"kubernetes.io/projected/83ecd586-6121-4e74-91f1-87267432cc2d-kube-api-access-hqcgm\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.708173 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83ecd586-6121-4e74-91f1-87267432cc2d-config-data\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.708190 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83ecd586-6121-4e74-91f1-87267432cc2d-scripts\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.708632 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gwq7v"] Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.780587 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.812153 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.811747 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.812926 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83ecd586-6121-4e74-91f1-87267432cc2d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.812962 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/83ecd586-6121-4e74-91f1-87267432cc2d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.813055 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83ecd586-6121-4e74-91f1-87267432cc2d-logs\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 
10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.813079 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83ecd586-6121-4e74-91f1-87267432cc2d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.813100 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqcgm\" (UniqueName: \"kubernetes.io/projected/83ecd586-6121-4e74-91f1-87267432cc2d-kube-api-access-hqcgm\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.813146 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83ecd586-6121-4e74-91f1-87267432cc2d-config-data\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.813165 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83ecd586-6121-4e74-91f1-87267432cc2d-scripts\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.818285 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83ecd586-6121-4e74-91f1-87267432cc2d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.819122 
4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/83ecd586-6121-4e74-91f1-87267432cc2d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.827427 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83ecd586-6121-4e74-91f1-87267432cc2d-logs\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.832892 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83ecd586-6121-4e74-91f1-87267432cc2d-scripts\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.833292 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83ecd586-6121-4e74-91f1-87267432cc2d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.836497 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83ecd586-6121-4e74-91f1-87267432cc2d-config-data\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.842607 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqcgm\" (UniqueName: 
\"kubernetes.io/projected/83ecd586-6121-4e74-91f1-87267432cc2d-kube-api-access-hqcgm\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.860675 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"83ecd586-6121-4e74-91f1-87267432cc2d\") " pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.915187 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-combined-ca-bundle\") pod \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.915270 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-httpd-run\") pod \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.915293 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.915311 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-logs\") pod \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.915343 
4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-scripts\") pod \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.915378 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-config-data\") pod \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.915434 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvmv9\" (UniqueName: \"kubernetes.io/projected/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-kube-api-access-jvmv9\") pod \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.915466 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-internal-tls-certs\") pod \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\" (UID: \"31b6faa7-7a5d-47ba-8ee8-08866ee2933e\") " Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.916292 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "31b6faa7-7a5d-47ba-8ee8-08866ee2933e" (UID: "31b6faa7-7a5d-47ba-8ee8-08866ee2933e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.916563 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-logs" (OuterVolumeSpecName: "logs") pod "31b6faa7-7a5d-47ba-8ee8-08866ee2933e" (UID: "31b6faa7-7a5d-47ba-8ee8-08866ee2933e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.923109 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-kube-api-access-jvmv9" (OuterVolumeSpecName: "kube-api-access-jvmv9") pod "31b6faa7-7a5d-47ba-8ee8-08866ee2933e" (UID: "31b6faa7-7a5d-47ba-8ee8-08866ee2933e"). InnerVolumeSpecName "kube-api-access-jvmv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.923214 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "31b6faa7-7a5d-47ba-8ee8-08866ee2933e" (UID: "31b6faa7-7a5d-47ba-8ee8-08866ee2933e"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.923527 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-scripts" (OuterVolumeSpecName: "scripts") pod "31b6faa7-7a5d-47ba-8ee8-08866ee2933e" (UID: "31b6faa7-7a5d-47ba-8ee8-08866ee2933e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.954079 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31b6faa7-7a5d-47ba-8ee8-08866ee2933e" (UID: "31b6faa7-7a5d-47ba-8ee8-08866ee2933e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.980652 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.988938 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-config-data" (OuterVolumeSpecName: "config-data") pod "31b6faa7-7a5d-47ba-8ee8-08866ee2933e" (UID: "31b6faa7-7a5d-47ba-8ee8-08866ee2933e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:24 crc kubenswrapper[4740]: I1009 10:46:24.994122 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "31b6faa7-7a5d-47ba-8ee8-08866ee2933e" (UID: "31b6faa7-7a5d-47ba-8ee8-08866ee2933e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.017412 4740 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.017466 4740 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.017478 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-logs\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.017486 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.017495 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.017505 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvmv9\" (UniqueName: \"kubernetes.io/projected/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-kube-api-access-jvmv9\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.017515 4740 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.017523 4740 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b6faa7-7a5d-47ba-8ee8-08866ee2933e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.049727 4740 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.119018 4740 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.513192 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"31b6faa7-7a5d-47ba-8ee8-08866ee2933e","Type":"ContainerDied","Data":"c7c90232c3159a0e2aae51084e4ea6c15871fa59f7bc2cd20505cf60711548d2"} Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.513261 4740 scope.go:117] "RemoveContainer" containerID="7245c76c0a9620f25cb294bedb5646245c05a4c5c47ec0a715e2c39a30474bed" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.513434 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.516713 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.523924 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gwq7v" event={"ID":"a69aabb5-1a08-483b-b60b-65080c36912c","Type":"ContainerStarted","Data":"6b4f6035eb5a4dd2537ae221fe9dd5a8ad4809561297ad89376ea3b7348ffd37"} Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.555744 4740 scope.go:117] "RemoveContainer" containerID="437be0aeabdf24344b8a91f29200dc317f18ebb4acb4cf84d5ff25e799b29027" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.586464 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa61d9ff-5491-44c2-8004-908ed41ffad5","Type":"ContainerStarted","Data":"2b132c11b65372e4673b32ce9d4237a1e6327373c31e880e0cf81029fc9409ec"} Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.586524 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa61d9ff-5491-44c2-8004-908ed41ffad5","Type":"ContainerStarted","Data":"bedf0d9354399e392d76b4f53796dc81959eba8ed2cb0638073ea44de75184db"} Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.636072 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.649489 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.664450 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 10:46:25 crc kubenswrapper[4740]: E1009 10:46:25.664840 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31b6faa7-7a5d-47ba-8ee8-08866ee2933e" 
containerName="glance-log" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.664857 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="31b6faa7-7a5d-47ba-8ee8-08866ee2933e" containerName="glance-log" Oct 09 10:46:25 crc kubenswrapper[4740]: E1009 10:46:25.664875 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31b6faa7-7a5d-47ba-8ee8-08866ee2933e" containerName="glance-httpd" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.664882 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="31b6faa7-7a5d-47ba-8ee8-08866ee2933e" containerName="glance-httpd" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.665048 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="31b6faa7-7a5d-47ba-8ee8-08866ee2933e" containerName="glance-log" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.665068 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="31b6faa7-7a5d-47ba-8ee8-08866ee2933e" containerName="glance-httpd" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.677125 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.679374 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.679545 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.681901 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.787255 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09308063-0c8c-4f0a-83f5-779364607b38" path="/var/lib/kubelet/pods/09308063-0c8c-4f0a-83f5-779364607b38/volumes" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.788661 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31b6faa7-7a5d-47ba-8ee8-08866ee2933e" path="/var/lib/kubelet/pods/31b6faa7-7a5d-47ba-8ee8-08866ee2933e/volumes" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.883696 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eae963c-dabb-4da9-ac57-86a621088e55-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.883806 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eae963c-dabb-4da9-ac57-86a621088e55-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.883831 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kn9c\" (UniqueName: \"kubernetes.io/projected/5eae963c-dabb-4da9-ac57-86a621088e55-kube-api-access-4kn9c\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.883856 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.883872 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5eae963c-dabb-4da9-ac57-86a621088e55-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.883956 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eae963c-dabb-4da9-ac57-86a621088e55-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.883976 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5eae963c-dabb-4da9-ac57-86a621088e55-logs\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.884075 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5eae963c-dabb-4da9-ac57-86a621088e55-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.985590 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5eae963c-dabb-4da9-ac57-86a621088e55-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.985667 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eae963c-dabb-4da9-ac57-86a621088e55-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.985718 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eae963c-dabb-4da9-ac57-86a621088e55-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.985736 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kn9c\" (UniqueName: \"kubernetes.io/projected/5eae963c-dabb-4da9-ac57-86a621088e55-kube-api-access-4kn9c\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.985777 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.985796 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5eae963c-dabb-4da9-ac57-86a621088e55-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.985832 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eae963c-dabb-4da9-ac57-86a621088e55-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.985847 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5eae963c-dabb-4da9-ac57-86a621088e55-logs\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.986319 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5eae963c-dabb-4da9-ac57-86a621088e55-logs\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.986524 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/5eae963c-dabb-4da9-ac57-86a621088e55-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.987296 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.993411 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5eae963c-dabb-4da9-ac57-86a621088e55-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.994455 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eae963c-dabb-4da9-ac57-86a621088e55-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:25 crc kubenswrapper[4740]: I1009 10:46:25.995035 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eae963c-dabb-4da9-ac57-86a621088e55-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:26 crc kubenswrapper[4740]: I1009 10:46:26.006413 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eae963c-dabb-4da9-ac57-86a621088e55-config-data\") 
pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:26 crc kubenswrapper[4740]: I1009 10:46:26.013251 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kn9c\" (UniqueName: \"kubernetes.io/projected/5eae963c-dabb-4da9-ac57-86a621088e55-kube-api-access-4kn9c\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:26 crc kubenswrapper[4740]: I1009 10:46:26.031533 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"5eae963c-dabb-4da9-ac57-86a621088e55\") " pod="openstack/glance-default-internal-api-0" Oct 09 10:46:26 crc kubenswrapper[4740]: I1009 10:46:26.291030 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:26 crc kubenswrapper[4740]: I1009 10:46:26.333027 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 10:46:26 crc kubenswrapper[4740]: I1009 10:46:26.612221 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa61d9ff-5491-44c2-8004-908ed41ffad5","Type":"ContainerStarted","Data":"e6cd471c86ef6386725cb57b5153bb6ab7bd99d8c9b5956a29efda7797e31e45"} Oct 09 10:46:26 crc kubenswrapper[4740]: I1009 10:46:26.615282 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"83ecd586-6121-4e74-91f1-87267432cc2d","Type":"ContainerStarted","Data":"a8eaad18d692ee3c3c437874c0cd1526ca0175ddb1b99aaf36f468259cb0be04"} Oct 09 10:46:26 crc kubenswrapper[4740]: I1009 10:46:26.615319 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"83ecd586-6121-4e74-91f1-87267432cc2d","Type":"ContainerStarted","Data":"1212cec21d90c182b162d93e6f40cd9ecbca9eda104ea5a16dc516f367bacfa1"} Oct 09 10:46:26 crc kubenswrapper[4740]: I1009 10:46:26.905647 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 10:46:26 crc kubenswrapper[4740]: W1009 10:46:26.913926 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5eae963c_dabb_4da9_ac57_86a621088e55.slice/crio-91e7af9cb56191c8be5203e94a330bbd3ee05efcb6ce525208be9ffe88b800fe WatchSource:0}: Error finding container 91e7af9cb56191c8be5203e94a330bbd3ee05efcb6ce525208be9ffe88b800fe: Status 404 returned error can't find the container with id 91e7af9cb56191c8be5203e94a330bbd3ee05efcb6ce525208be9ffe88b800fe Oct 09 10:46:27 crc kubenswrapper[4740]: I1009 10:46:27.634342 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"5eae963c-dabb-4da9-ac57-86a621088e55","Type":"ContainerStarted","Data":"3f57dabe50a535b0e80c7409a987382ed04e170c38230770a409a572e8dcd991"} Oct 09 10:46:27 crc kubenswrapper[4740]: I1009 10:46:27.634802 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5eae963c-dabb-4da9-ac57-86a621088e55","Type":"ContainerStarted","Data":"91e7af9cb56191c8be5203e94a330bbd3ee05efcb6ce525208be9ffe88b800fe"} Oct 09 10:46:27 crc kubenswrapper[4740]: I1009 10:46:27.640418 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa61d9ff-5491-44c2-8004-908ed41ffad5","Type":"ContainerStarted","Data":"b9f12392c01852b8695bbcbcdebee1e2782e0e743118a438463de3f70f083100"} Oct 09 10:46:27 crc kubenswrapper[4740]: I1009 10:46:27.640518 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa61d9ff-5491-44c2-8004-908ed41ffad5" containerName="ceilometer-central-agent" containerID="cri-o://bedf0d9354399e392d76b4f53796dc81959eba8ed2cb0638073ea44de75184db" gracePeriod=30 Oct 09 10:46:27 crc kubenswrapper[4740]: I1009 10:46:27.640612 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa61d9ff-5491-44c2-8004-908ed41ffad5" containerName="proxy-httpd" containerID="cri-o://b9f12392c01852b8695bbcbcdebee1e2782e0e743118a438463de3f70f083100" gracePeriod=30 Oct 09 10:46:27 crc kubenswrapper[4740]: I1009 10:46:27.640647 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa61d9ff-5491-44c2-8004-908ed41ffad5" containerName="sg-core" containerID="cri-o://e6cd471c86ef6386725cb57b5153bb6ab7bd99d8c9b5956a29efda7797e31e45" gracePeriod=30 Oct 09 10:46:27 crc kubenswrapper[4740]: I1009 10:46:27.640677 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="aa61d9ff-5491-44c2-8004-908ed41ffad5" containerName="ceilometer-notification-agent" containerID="cri-o://2b132c11b65372e4673b32ce9d4237a1e6327373c31e880e0cf81029fc9409ec" gracePeriod=30 Oct 09 10:46:27 crc kubenswrapper[4740]: I1009 10:46:27.640812 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 10:46:27 crc kubenswrapper[4740]: I1009 10:46:27.653194 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"83ecd586-6121-4e74-91f1-87267432cc2d","Type":"ContainerStarted","Data":"02ecec20cc75ce6426e3280c7bcaae9c2522151c70ceeb1710fa7cc2cceddd7e"} Oct 09 10:46:27 crc kubenswrapper[4740]: I1009 10:46:27.664188 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.143183709 podStartE2EDuration="5.664164983s" podCreationTimestamp="2025-10-09 10:46:22 +0000 UTC" firstStartedPulling="2025-10-09 10:46:23.673374344 +0000 UTC m=+1122.635574725" lastFinishedPulling="2025-10-09 10:46:27.194355618 +0000 UTC m=+1126.156555999" observedRunningTime="2025-10-09 10:46:27.663250659 +0000 UTC m=+1126.625451040" watchObservedRunningTime="2025-10-09 10:46:27.664164983 +0000 UTC m=+1126.626365364" Oct 09 10:46:27 crc kubenswrapper[4740]: I1009 10:46:27.686636 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.686620662 podStartE2EDuration="3.686620662s" podCreationTimestamp="2025-10-09 10:46:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:46:27.684959117 +0000 UTC m=+1126.647159498" watchObservedRunningTime="2025-10-09 10:46:27.686620662 +0000 UTC m=+1126.648821043" Oct 09 10:46:28 crc kubenswrapper[4740]: I1009 10:46:28.664683 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"5eae963c-dabb-4da9-ac57-86a621088e55","Type":"ContainerStarted","Data":"24216a3af983b47c7e623b66c93abd85c289f135638d02e9cbea6d227fb50942"} Oct 09 10:46:28 crc kubenswrapper[4740]: I1009 10:46:28.667316 4740 generic.go:334] "Generic (PLEG): container finished" podID="aa61d9ff-5491-44c2-8004-908ed41ffad5" containerID="b9f12392c01852b8695bbcbcdebee1e2782e0e743118a438463de3f70f083100" exitCode=0 Oct 09 10:46:28 crc kubenswrapper[4740]: I1009 10:46:28.667336 4740 generic.go:334] "Generic (PLEG): container finished" podID="aa61d9ff-5491-44c2-8004-908ed41ffad5" containerID="e6cd471c86ef6386725cb57b5153bb6ab7bd99d8c9b5956a29efda7797e31e45" exitCode=2 Oct 09 10:46:28 crc kubenswrapper[4740]: I1009 10:46:28.667345 4740 generic.go:334] "Generic (PLEG): container finished" podID="aa61d9ff-5491-44c2-8004-908ed41ffad5" containerID="2b132c11b65372e4673b32ce9d4237a1e6327373c31e880e0cf81029fc9409ec" exitCode=0 Oct 09 10:46:28 crc kubenswrapper[4740]: I1009 10:46:28.667388 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa61d9ff-5491-44c2-8004-908ed41ffad5","Type":"ContainerDied","Data":"b9f12392c01852b8695bbcbcdebee1e2782e0e743118a438463de3f70f083100"} Oct 09 10:46:28 crc kubenswrapper[4740]: I1009 10:46:28.667416 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa61d9ff-5491-44c2-8004-908ed41ffad5","Type":"ContainerDied","Data":"e6cd471c86ef6386725cb57b5153bb6ab7bd99d8c9b5956a29efda7797e31e45"} Oct 09 10:46:28 crc kubenswrapper[4740]: I1009 10:46:28.667427 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa61d9ff-5491-44c2-8004-908ed41ffad5","Type":"ContainerDied","Data":"2b132c11b65372e4673b32ce9d4237a1e6327373c31e880e0cf81029fc9409ec"} Oct 09 10:46:28 crc kubenswrapper[4740]: I1009 10:46:28.694679 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.694663686 podStartE2EDuration="3.694663686s" podCreationTimestamp="2025-10-09 10:46:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:46:28.689049254 +0000 UTC m=+1127.651249625" watchObservedRunningTime="2025-10-09 10:46:28.694663686 +0000 UTC m=+1127.656864067" Oct 09 10:46:31 crc kubenswrapper[4740]: I1009 10:46:31.694709 4740 generic.go:334] "Generic (PLEG): container finished" podID="aa61d9ff-5491-44c2-8004-908ed41ffad5" containerID="bedf0d9354399e392d76b4f53796dc81959eba8ed2cb0638073ea44de75184db" exitCode=0 Oct 09 10:46:31 crc kubenswrapper[4740]: I1009 10:46:31.694795 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa61d9ff-5491-44c2-8004-908ed41ffad5","Type":"ContainerDied","Data":"bedf0d9354399e392d76b4f53796dc81959eba8ed2cb0638073ea44de75184db"} Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.127840 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.254165 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-sg-core-conf-yaml\") pod \"aa61d9ff-5491-44c2-8004-908ed41ffad5\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.254413 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-scripts\") pod \"aa61d9ff-5491-44c2-8004-908ed41ffad5\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.254523 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa61d9ff-5491-44c2-8004-908ed41ffad5-run-httpd\") pod \"aa61d9ff-5491-44c2-8004-908ed41ffad5\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.254650 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa61d9ff-5491-44c2-8004-908ed41ffad5-log-httpd\") pod \"aa61d9ff-5491-44c2-8004-908ed41ffad5\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.254716 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-config-data\") pod \"aa61d9ff-5491-44c2-8004-908ed41ffad5\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.254787 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-combined-ca-bundle\") pod \"aa61d9ff-5491-44c2-8004-908ed41ffad5\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.254822 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb8zq\" (UniqueName: \"kubernetes.io/projected/aa61d9ff-5491-44c2-8004-908ed41ffad5-kube-api-access-fb8zq\") pod \"aa61d9ff-5491-44c2-8004-908ed41ffad5\" (UID: \"aa61d9ff-5491-44c2-8004-908ed41ffad5\") " Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.254896 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa61d9ff-5491-44c2-8004-908ed41ffad5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "aa61d9ff-5491-44c2-8004-908ed41ffad5" (UID: "aa61d9ff-5491-44c2-8004-908ed41ffad5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.255176 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa61d9ff-5491-44c2-8004-908ed41ffad5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "aa61d9ff-5491-44c2-8004-908ed41ffad5" (UID: "aa61d9ff-5491-44c2-8004-908ed41ffad5"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.255583 4740 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa61d9ff-5491-44c2-8004-908ed41ffad5-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.255616 4740 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa61d9ff-5491-44c2-8004-908ed41ffad5-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.259813 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-scripts" (OuterVolumeSpecName: "scripts") pod "aa61d9ff-5491-44c2-8004-908ed41ffad5" (UID: "aa61d9ff-5491-44c2-8004-908ed41ffad5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.259876 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa61d9ff-5491-44c2-8004-908ed41ffad5-kube-api-access-fb8zq" (OuterVolumeSpecName: "kube-api-access-fb8zq") pod "aa61d9ff-5491-44c2-8004-908ed41ffad5" (UID: "aa61d9ff-5491-44c2-8004-908ed41ffad5"). InnerVolumeSpecName "kube-api-access-fb8zq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.289842 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "aa61d9ff-5491-44c2-8004-908ed41ffad5" (UID: "aa61d9ff-5491-44c2-8004-908ed41ffad5"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.354823 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa61d9ff-5491-44c2-8004-908ed41ffad5" (UID: "aa61d9ff-5491-44c2-8004-908ed41ffad5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.357156 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.357205 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb8zq\" (UniqueName: \"kubernetes.io/projected/aa61d9ff-5491-44c2-8004-908ed41ffad5-kube-api-access-fb8zq\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.357227 4740 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.357244 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.384046 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-config-data" (OuterVolumeSpecName: "config-data") pod "aa61d9ff-5491-44c2-8004-908ed41ffad5" (UID: "aa61d9ff-5491-44c2-8004-908ed41ffad5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.458880 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa61d9ff-5491-44c2-8004-908ed41ffad5-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.725506 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gwq7v" event={"ID":"a69aabb5-1a08-483b-b60b-65080c36912c","Type":"ContainerStarted","Data":"7423e2ae9cc7d138508f764b745adfcca9c4d4c73313b74e561ecafeef343599"} Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.731814 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa61d9ff-5491-44c2-8004-908ed41ffad5","Type":"ContainerDied","Data":"ab2b776172fdc4bfef22e26cf1c63a9afd3d83665e8ffe4e5dbf7b9b90735111"} Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.731880 4740 scope.go:117] "RemoveContainer" containerID="b9f12392c01852b8695bbcbcdebee1e2782e0e743118a438463de3f70f083100" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.731930 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.751665 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-gwq7v" podStartSLOduration=2.338681222 podStartE2EDuration="10.751644676s" podCreationTimestamp="2025-10-09 10:46:23 +0000 UTC" firstStartedPulling="2025-10-09 10:46:24.709455428 +0000 UTC m=+1123.671655809" lastFinishedPulling="2025-10-09 10:46:33.122418882 +0000 UTC m=+1132.084619263" observedRunningTime="2025-10-09 10:46:33.742136819 +0000 UTC m=+1132.704337210" watchObservedRunningTime="2025-10-09 10:46:33.751644676 +0000 UTC m=+1132.713845067" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.769029 4740 scope.go:117] "RemoveContainer" containerID="e6cd471c86ef6386725cb57b5153bb6ab7bd99d8c9b5956a29efda7797e31e45" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.798623 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.798857 4740 scope.go:117] "RemoveContainer" containerID="2b132c11b65372e4673b32ce9d4237a1e6327373c31e880e0cf81029fc9409ec" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.807633 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.830227 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:33 crc kubenswrapper[4740]: E1009 10:46:33.830712 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa61d9ff-5491-44c2-8004-908ed41ffad5" containerName="ceilometer-central-agent" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.830738 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa61d9ff-5491-44c2-8004-908ed41ffad5" containerName="ceilometer-central-agent" Oct 09 10:46:33 crc kubenswrapper[4740]: E1009 10:46:33.830771 4740 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="aa61d9ff-5491-44c2-8004-908ed41ffad5" containerName="ceilometer-notification-agent" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.830781 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa61d9ff-5491-44c2-8004-908ed41ffad5" containerName="ceilometer-notification-agent" Oct 09 10:46:33 crc kubenswrapper[4740]: E1009 10:46:33.830812 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa61d9ff-5491-44c2-8004-908ed41ffad5" containerName="proxy-httpd" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.830821 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa61d9ff-5491-44c2-8004-908ed41ffad5" containerName="proxy-httpd" Oct 09 10:46:33 crc kubenswrapper[4740]: E1009 10:46:33.830833 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa61d9ff-5491-44c2-8004-908ed41ffad5" containerName="sg-core" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.830840 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa61d9ff-5491-44c2-8004-908ed41ffad5" containerName="sg-core" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.831056 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa61d9ff-5491-44c2-8004-908ed41ffad5" containerName="ceilometer-notification-agent" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.831082 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa61d9ff-5491-44c2-8004-908ed41ffad5" containerName="ceilometer-central-agent" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.831095 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa61d9ff-5491-44c2-8004-908ed41ffad5" containerName="sg-core" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.831116 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa61d9ff-5491-44c2-8004-908ed41ffad5" containerName="proxy-httpd" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.832984 4740 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.835495 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.835721 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.840160 4740 scope.go:117] "RemoveContainer" containerID="bedf0d9354399e392d76b4f53796dc81959eba8ed2cb0638073ea44de75184db" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.855976 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.967888 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " pod="openstack/ceilometer-0" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.967928 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-config-data\") pod \"ceilometer-0\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " pod="openstack/ceilometer-0" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.967960 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klmhb\" (UniqueName: \"kubernetes.io/projected/ed1a4238-024e-420e-9848-cd048fdd24f3-kube-api-access-klmhb\") pod \"ceilometer-0\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " pod="openstack/ceilometer-0" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.968001 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " pod="openstack/ceilometer-0" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.968025 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-scripts\") pod \"ceilometer-0\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " pod="openstack/ceilometer-0" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.968074 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed1a4238-024e-420e-9848-cd048fdd24f3-run-httpd\") pod \"ceilometer-0\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " pod="openstack/ceilometer-0" Oct 09 10:46:33 crc kubenswrapper[4740]: I1009 10:46:33.968087 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed1a4238-024e-420e-9848-cd048fdd24f3-log-httpd\") pod \"ceilometer-0\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " pod="openstack/ceilometer-0" Oct 09 10:46:34 crc kubenswrapper[4740]: I1009 10:46:34.069904 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klmhb\" (UniqueName: \"kubernetes.io/projected/ed1a4238-024e-420e-9848-cd048fdd24f3-kube-api-access-klmhb\") pod \"ceilometer-0\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " pod="openstack/ceilometer-0" Oct 09 10:46:34 crc kubenswrapper[4740]: I1009 10:46:34.070026 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " pod="openstack/ceilometer-0" Oct 09 10:46:34 crc kubenswrapper[4740]: I1009 10:46:34.070081 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-scripts\") pod \"ceilometer-0\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " pod="openstack/ceilometer-0" Oct 09 10:46:34 crc kubenswrapper[4740]: I1009 10:46:34.070234 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed1a4238-024e-420e-9848-cd048fdd24f3-run-httpd\") pod \"ceilometer-0\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " pod="openstack/ceilometer-0" Oct 09 10:46:34 crc kubenswrapper[4740]: I1009 10:46:34.070267 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed1a4238-024e-420e-9848-cd048fdd24f3-log-httpd\") pod \"ceilometer-0\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " pod="openstack/ceilometer-0" Oct 09 10:46:34 crc kubenswrapper[4740]: I1009 10:46:34.070398 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " pod="openstack/ceilometer-0" Oct 09 10:46:34 crc kubenswrapper[4740]: I1009 10:46:34.070441 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-config-data\") pod \"ceilometer-0\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " pod="openstack/ceilometer-0" Oct 09 10:46:34 crc kubenswrapper[4740]: I1009 10:46:34.072537 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed1a4238-024e-420e-9848-cd048fdd24f3-run-httpd\") pod \"ceilometer-0\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " pod="openstack/ceilometer-0" Oct 09 10:46:34 crc kubenswrapper[4740]: I1009 10:46:34.075253 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed1a4238-024e-420e-9848-cd048fdd24f3-log-httpd\") pod \"ceilometer-0\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " pod="openstack/ceilometer-0" Oct 09 10:46:34 crc kubenswrapper[4740]: I1009 10:46:34.075963 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-config-data\") pod \"ceilometer-0\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " pod="openstack/ceilometer-0" Oct 09 10:46:34 crc kubenswrapper[4740]: I1009 10:46:34.077722 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " pod="openstack/ceilometer-0" Oct 09 10:46:34 crc kubenswrapper[4740]: I1009 10:46:34.080383 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " pod="openstack/ceilometer-0" Oct 09 10:46:34 crc kubenswrapper[4740]: I1009 10:46:34.087530 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-scripts\") pod \"ceilometer-0\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " pod="openstack/ceilometer-0" Oct 09 
10:46:34 crc kubenswrapper[4740]: I1009 10:46:34.099379 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klmhb\" (UniqueName: \"kubernetes.io/projected/ed1a4238-024e-420e-9848-cd048fdd24f3-kube-api-access-klmhb\") pod \"ceilometer-0\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " pod="openstack/ceilometer-0" Oct 09 10:46:34 crc kubenswrapper[4740]: I1009 10:46:34.163032 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:46:34 crc kubenswrapper[4740]: I1009 10:46:34.273875 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="eeb3ea7a-c4b5-4f0d-b4e6-31c0699bd1b3" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.164:3000/\": dial tcp 10.217.0.164:3000: i/o timeout (Client.Timeout exceeded while awaiting headers)" Oct 09 10:46:34 crc kubenswrapper[4740]: I1009 10:46:34.615078 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:46:34 crc kubenswrapper[4740]: I1009 10:46:34.743447 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed1a4238-024e-420e-9848-cd048fdd24f3","Type":"ContainerStarted","Data":"8bb7382ee511ac977716b7468417b3abce394f0a99923c532789930b6f01c20c"} Oct 09 10:46:34 crc kubenswrapper[4740]: I1009 10:46:34.981964 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 09 10:46:34 crc kubenswrapper[4740]: I1009 10:46:34.982012 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 09 10:46:35 crc kubenswrapper[4740]: I1009 10:46:35.012018 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 09 10:46:35 crc kubenswrapper[4740]: I1009 10:46:35.035105 4740 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 09 10:46:35 crc kubenswrapper[4740]: I1009 10:46:35.411353 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 10:46:35 crc kubenswrapper[4740]: I1009 10:46:35.411817 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 10:46:35 crc kubenswrapper[4740]: I1009 10:46:35.753651 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed1a4238-024e-420e-9848-cd048fdd24f3","Type":"ContainerStarted","Data":"8ca2c805a48391eb7f17c9b40203acd90f090fa047d96f13d08cc43e18cbd74e"} Oct 09 10:46:35 crc kubenswrapper[4740]: I1009 10:46:35.754263 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 09 10:46:35 crc kubenswrapper[4740]: I1009 10:46:35.754288 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 09 10:46:35 crc kubenswrapper[4740]: I1009 10:46:35.795971 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa61d9ff-5491-44c2-8004-908ed41ffad5" path="/var/lib/kubelet/pods/aa61d9ff-5491-44c2-8004-908ed41ffad5/volumes" Oct 09 10:46:36 crc kubenswrapper[4740]: I1009 10:46:36.336266 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 09 10:46:36 crc kubenswrapper[4740]: I1009 10:46:36.336316 4740 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 09 10:46:36 crc kubenswrapper[4740]: I1009 10:46:36.374683 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 09 10:46:36 crc kubenswrapper[4740]: I1009 10:46:36.388806 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 09 10:46:36 crc kubenswrapper[4740]: I1009 10:46:36.763684 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed1a4238-024e-420e-9848-cd048fdd24f3","Type":"ContainerStarted","Data":"4e048d1a005aaeea374f9d6537232f6ec218752cea298ff87443a25582513bcb"} Oct 09 10:46:36 crc kubenswrapper[4740]: I1009 10:46:36.763721 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed1a4238-024e-420e-9848-cd048fdd24f3","Type":"ContainerStarted","Data":"2427dc837e7ecd3f596e1b8961bb99f609986268b60d39d0bc7dabc880589b77"} Oct 09 10:46:36 crc kubenswrapper[4740]: I1009 10:46:36.764457 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 09 10:46:36 crc kubenswrapper[4740]: I1009 10:46:36.764856 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 09 10:46:37 crc kubenswrapper[4740]: I1009 10:46:37.619373 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 09 10:46:37 crc kubenswrapper[4740]: I1009 10:46:37.771388 4740 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 10:46:37 crc kubenswrapper[4740]: I1009 10:46:37.833408 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 09 10:46:38 crc kubenswrapper[4740]: I1009 
10:46:38.752487 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 09 10:46:38 crc kubenswrapper[4740]: I1009 10:46:38.765611 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 09 10:46:38 crc kubenswrapper[4740]: I1009 10:46:38.785427 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed1a4238-024e-420e-9848-cd048fdd24f3","Type":"ContainerStarted","Data":"8c7cb8aeb4be3d42b8ec3c91ff50df0195850a4e5c40bfb5ae6ffd304a47be1e"} Oct 09 10:46:38 crc kubenswrapper[4740]: I1009 10:46:38.839639 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.733403456 podStartE2EDuration="5.839624815s" podCreationTimestamp="2025-10-09 10:46:33 +0000 UTC" firstStartedPulling="2025-10-09 10:46:34.610300805 +0000 UTC m=+1133.572501186" lastFinishedPulling="2025-10-09 10:46:37.716522164 +0000 UTC m=+1136.678722545" observedRunningTime="2025-10-09 10:46:38.833669444 +0000 UTC m=+1137.795869815" watchObservedRunningTime="2025-10-09 10:46:38.839624815 +0000 UTC m=+1137.801825196" Oct 09 10:46:39 crc kubenswrapper[4740]: I1009 10:46:39.792965 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 10:46:44 crc kubenswrapper[4740]: I1009 10:46:44.863398 4740 generic.go:334] "Generic (PLEG): container finished" podID="a69aabb5-1a08-483b-b60b-65080c36912c" containerID="7423e2ae9cc7d138508f764b745adfcca9c4d4c73313b74e561ecafeef343599" exitCode=0 Oct 09 10:46:44 crc kubenswrapper[4740]: I1009 10:46:44.863488 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gwq7v" event={"ID":"a69aabb5-1a08-483b-b60b-65080c36912c","Type":"ContainerDied","Data":"7423e2ae9cc7d138508f764b745adfcca9c4d4c73313b74e561ecafeef343599"} Oct 09 10:46:46 crc 
kubenswrapper[4740]: I1009 10:46:46.265888 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gwq7v" Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.438646 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a69aabb5-1a08-483b-b60b-65080c36912c-config-data\") pod \"a69aabb5-1a08-483b-b60b-65080c36912c\" (UID: \"a69aabb5-1a08-483b-b60b-65080c36912c\") " Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.438715 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a69aabb5-1a08-483b-b60b-65080c36912c-scripts\") pod \"a69aabb5-1a08-483b-b60b-65080c36912c\" (UID: \"a69aabb5-1a08-483b-b60b-65080c36912c\") " Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.438823 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qfvb\" (UniqueName: \"kubernetes.io/projected/a69aabb5-1a08-483b-b60b-65080c36912c-kube-api-access-5qfvb\") pod \"a69aabb5-1a08-483b-b60b-65080c36912c\" (UID: \"a69aabb5-1a08-483b-b60b-65080c36912c\") " Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.438884 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69aabb5-1a08-483b-b60b-65080c36912c-combined-ca-bundle\") pod \"a69aabb5-1a08-483b-b60b-65080c36912c\" (UID: \"a69aabb5-1a08-483b-b60b-65080c36912c\") " Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.447198 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a69aabb5-1a08-483b-b60b-65080c36912c-kube-api-access-5qfvb" (OuterVolumeSpecName: "kube-api-access-5qfvb") pod "a69aabb5-1a08-483b-b60b-65080c36912c" (UID: "a69aabb5-1a08-483b-b60b-65080c36912c"). InnerVolumeSpecName "kube-api-access-5qfvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.447226 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a69aabb5-1a08-483b-b60b-65080c36912c-scripts" (OuterVolumeSpecName: "scripts") pod "a69aabb5-1a08-483b-b60b-65080c36912c" (UID: "a69aabb5-1a08-483b-b60b-65080c36912c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.466350 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a69aabb5-1a08-483b-b60b-65080c36912c-config-data" (OuterVolumeSpecName: "config-data") pod "a69aabb5-1a08-483b-b60b-65080c36912c" (UID: "a69aabb5-1a08-483b-b60b-65080c36912c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.472465 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a69aabb5-1a08-483b-b60b-65080c36912c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a69aabb5-1a08-483b-b60b-65080c36912c" (UID: "a69aabb5-1a08-483b-b60b-65080c36912c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.541980 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a69aabb5-1a08-483b-b60b-65080c36912c-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.542030 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a69aabb5-1a08-483b-b60b-65080c36912c-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.542047 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qfvb\" (UniqueName: \"kubernetes.io/projected/a69aabb5-1a08-483b-b60b-65080c36912c-kube-api-access-5qfvb\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.542061 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69aabb5-1a08-483b-b60b-65080c36912c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.887297 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gwq7v" event={"ID":"a69aabb5-1a08-483b-b60b-65080c36912c","Type":"ContainerDied","Data":"6b4f6035eb5a4dd2537ae221fe9dd5a8ad4809561297ad89376ea3b7348ffd37"} Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.887328 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gwq7v" Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.887379 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b4f6035eb5a4dd2537ae221fe9dd5a8ad4809561297ad89376ea3b7348ffd37" Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.970916 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 09 10:46:46 crc kubenswrapper[4740]: E1009 10:46:46.971345 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69aabb5-1a08-483b-b60b-65080c36912c" containerName="nova-cell0-conductor-db-sync" Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.971369 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69aabb5-1a08-483b-b60b-65080c36912c" containerName="nova-cell0-conductor-db-sync" Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.971621 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69aabb5-1a08-483b-b60b-65080c36912c" containerName="nova-cell0-conductor-db-sync" Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.972571 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.974600 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.974815 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qd29k" Oct 09 10:46:46 crc kubenswrapper[4740]: I1009 10:46:46.991516 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 09 10:46:47 crc kubenswrapper[4740]: I1009 10:46:47.152705 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/305a12c3-450d-43fc-87bb-9bb293438451-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"305a12c3-450d-43fc-87bb-9bb293438451\") " pod="openstack/nova-cell0-conductor-0" Oct 09 10:46:47 crc kubenswrapper[4740]: I1009 10:46:47.152873 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvkpn\" (UniqueName: \"kubernetes.io/projected/305a12c3-450d-43fc-87bb-9bb293438451-kube-api-access-pvkpn\") pod \"nova-cell0-conductor-0\" (UID: \"305a12c3-450d-43fc-87bb-9bb293438451\") " pod="openstack/nova-cell0-conductor-0" Oct 09 10:46:47 crc kubenswrapper[4740]: I1009 10:46:47.152912 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/305a12c3-450d-43fc-87bb-9bb293438451-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"305a12c3-450d-43fc-87bb-9bb293438451\") " pod="openstack/nova-cell0-conductor-0" Oct 09 10:46:47 crc kubenswrapper[4740]: I1009 10:46:47.254919 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvkpn\" (UniqueName: 
\"kubernetes.io/projected/305a12c3-450d-43fc-87bb-9bb293438451-kube-api-access-pvkpn\") pod \"nova-cell0-conductor-0\" (UID: \"305a12c3-450d-43fc-87bb-9bb293438451\") " pod="openstack/nova-cell0-conductor-0" Oct 09 10:46:47 crc kubenswrapper[4740]: I1009 10:46:47.255308 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/305a12c3-450d-43fc-87bb-9bb293438451-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"305a12c3-450d-43fc-87bb-9bb293438451\") " pod="openstack/nova-cell0-conductor-0" Oct 09 10:46:47 crc kubenswrapper[4740]: I1009 10:46:47.255431 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/305a12c3-450d-43fc-87bb-9bb293438451-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"305a12c3-450d-43fc-87bb-9bb293438451\") " pod="openstack/nova-cell0-conductor-0" Oct 09 10:46:47 crc kubenswrapper[4740]: I1009 10:46:47.262202 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/305a12c3-450d-43fc-87bb-9bb293438451-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"305a12c3-450d-43fc-87bb-9bb293438451\") " pod="openstack/nova-cell0-conductor-0" Oct 09 10:46:47 crc kubenswrapper[4740]: I1009 10:46:47.263385 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/305a12c3-450d-43fc-87bb-9bb293438451-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"305a12c3-450d-43fc-87bb-9bb293438451\") " pod="openstack/nova-cell0-conductor-0" Oct 09 10:46:47 crc kubenswrapper[4740]: I1009 10:46:47.278473 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvkpn\" (UniqueName: \"kubernetes.io/projected/305a12c3-450d-43fc-87bb-9bb293438451-kube-api-access-pvkpn\") pod \"nova-cell0-conductor-0\" (UID: 
\"305a12c3-450d-43fc-87bb-9bb293438451\") " pod="openstack/nova-cell0-conductor-0" Oct 09 10:46:47 crc kubenswrapper[4740]: I1009 10:46:47.292017 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 09 10:46:47 crc kubenswrapper[4740]: W1009 10:46:47.763721 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod305a12c3_450d_43fc_87bb_9bb293438451.slice/crio-3cc423e6c67e34d2614fa620137d77db5db81b579eba36be65be3dafd2a5e8de WatchSource:0}: Error finding container 3cc423e6c67e34d2614fa620137d77db5db81b579eba36be65be3dafd2a5e8de: Status 404 returned error can't find the container with id 3cc423e6c67e34d2614fa620137d77db5db81b579eba36be65be3dafd2a5e8de Oct 09 10:46:47 crc kubenswrapper[4740]: I1009 10:46:47.769229 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 09 10:46:47 crc kubenswrapper[4740]: I1009 10:46:47.897531 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"305a12c3-450d-43fc-87bb-9bb293438451","Type":"ContainerStarted","Data":"3cc423e6c67e34d2614fa620137d77db5db81b579eba36be65be3dafd2a5e8de"} Oct 09 10:46:48 crc kubenswrapper[4740]: I1009 10:46:48.921728 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"305a12c3-450d-43fc-87bb-9bb293438451","Type":"ContainerStarted","Data":"26254a821ac6354dcd038caa15f6837d1195746f8ae62c6bd6eba00d179ab794"} Oct 09 10:46:48 crc kubenswrapper[4740]: I1009 10:46:48.923567 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 09 10:46:48 crc kubenswrapper[4740]: I1009 10:46:48.958427 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.958408565 podStartE2EDuration="2.958408565s" 
podCreationTimestamp="2025-10-09 10:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:46:48.94936553 +0000 UTC m=+1147.911565921" watchObservedRunningTime="2025-10-09 10:46:48.958408565 +0000 UTC m=+1147.920608946" Oct 09 10:46:57 crc kubenswrapper[4740]: I1009 10:46:57.328100 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 09 10:46:57 crc kubenswrapper[4740]: I1009 10:46:57.855984 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-c7th8"] Oct 09 10:46:57 crc kubenswrapper[4740]: I1009 10:46:57.858144 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-c7th8" Oct 09 10:46:57 crc kubenswrapper[4740]: I1009 10:46:57.860582 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 09 10:46:57 crc kubenswrapper[4740]: I1009 10:46:57.860854 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 09 10:46:57 crc kubenswrapper[4740]: I1009 10:46:57.873264 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-c7th8"] Oct 09 10:46:57 crc kubenswrapper[4740]: I1009 10:46:57.975581 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af75d27-96e9-44d0-95cc-d0137b792f96-scripts\") pod \"nova-cell0-cell-mapping-c7th8\" (UID: \"6af75d27-96e9-44d0-95cc-d0137b792f96\") " pod="openstack/nova-cell0-cell-mapping-c7th8" Oct 09 10:46:57 crc kubenswrapper[4740]: I1009 10:46:57.975620 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6af75d27-96e9-44d0-95cc-d0137b792f96-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-c7th8\" (UID: \"6af75d27-96e9-44d0-95cc-d0137b792f96\") " pod="openstack/nova-cell0-cell-mapping-c7th8" Oct 09 10:46:57 crc kubenswrapper[4740]: I1009 10:46:57.975663 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af75d27-96e9-44d0-95cc-d0137b792f96-config-data\") pod \"nova-cell0-cell-mapping-c7th8\" (UID: \"6af75d27-96e9-44d0-95cc-d0137b792f96\") " pod="openstack/nova-cell0-cell-mapping-c7th8" Oct 09 10:46:57 crc kubenswrapper[4740]: I1009 10:46:57.975858 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82jw8\" (UniqueName: \"kubernetes.io/projected/6af75d27-96e9-44d0-95cc-d0137b792f96-kube-api-access-82jw8\") pod \"nova-cell0-cell-mapping-c7th8\" (UID: \"6af75d27-96e9-44d0-95cc-d0137b792f96\") " pod="openstack/nova-cell0-cell-mapping-c7th8" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.077351 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af75d27-96e9-44d0-95cc-d0137b792f96-scripts\") pod \"nova-cell0-cell-mapping-c7th8\" (UID: \"6af75d27-96e9-44d0-95cc-d0137b792f96\") " pod="openstack/nova-cell0-cell-mapping-c7th8" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.077418 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af75d27-96e9-44d0-95cc-d0137b792f96-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-c7th8\" (UID: \"6af75d27-96e9-44d0-95cc-d0137b792f96\") " pod="openstack/nova-cell0-cell-mapping-c7th8" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.077445 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6af75d27-96e9-44d0-95cc-d0137b792f96-config-data\") pod \"nova-cell0-cell-mapping-c7th8\" (UID: \"6af75d27-96e9-44d0-95cc-d0137b792f96\") " pod="openstack/nova-cell0-cell-mapping-c7th8" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.077491 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82jw8\" (UniqueName: \"kubernetes.io/projected/6af75d27-96e9-44d0-95cc-d0137b792f96-kube-api-access-82jw8\") pod \"nova-cell0-cell-mapping-c7th8\" (UID: \"6af75d27-96e9-44d0-95cc-d0137b792f96\") " pod="openstack/nova-cell0-cell-mapping-c7th8" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.094557 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af75d27-96e9-44d0-95cc-d0137b792f96-config-data\") pod \"nova-cell0-cell-mapping-c7th8\" (UID: \"6af75d27-96e9-44d0-95cc-d0137b792f96\") " pod="openstack/nova-cell0-cell-mapping-c7th8" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.099391 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af75d27-96e9-44d0-95cc-d0137b792f96-scripts\") pod \"nova-cell0-cell-mapping-c7th8\" (UID: \"6af75d27-96e9-44d0-95cc-d0137b792f96\") " pod="openstack/nova-cell0-cell-mapping-c7th8" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.104415 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af75d27-96e9-44d0-95cc-d0137b792f96-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-c7th8\" (UID: \"6af75d27-96e9-44d0-95cc-d0137b792f96\") " pod="openstack/nova-cell0-cell-mapping-c7th8" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.104481 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82jw8\" (UniqueName: 
\"kubernetes.io/projected/6af75d27-96e9-44d0-95cc-d0137b792f96-kube-api-access-82jw8\") pod \"nova-cell0-cell-mapping-c7th8\" (UID: \"6af75d27-96e9-44d0-95cc-d0137b792f96\") " pod="openstack/nova-cell0-cell-mapping-c7th8" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.106917 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.108400 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.111725 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.130295 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.200260 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-c7th8" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.245292 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.247052 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.251099 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.266808 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.268532 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.274170 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.284323 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32448d24-361d-4fbd-934b-404da232f445-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"32448d24-361d-4fbd-934b-404da232f445\") " pod="openstack/nova-api-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.284430 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32448d24-361d-4fbd-934b-404da232f445-config-data\") pod \"nova-api-0\" (UID: \"32448d24-361d-4fbd-934b-404da232f445\") " pod="openstack/nova-api-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.284527 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmjlj\" (UniqueName: \"kubernetes.io/projected/32448d24-361d-4fbd-934b-404da232f445-kube-api-access-fmjlj\") pod \"nova-api-0\" (UID: \"32448d24-361d-4fbd-934b-404da232f445\") " pod="openstack/nova-api-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.284892 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32448d24-361d-4fbd-934b-404da232f445-logs\") pod \"nova-api-0\" (UID: \"32448d24-361d-4fbd-934b-404da232f445\") " pod="openstack/nova-api-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.297326 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.310572 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-scheduler-0"] Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.353838 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.355111 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.359648 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.362677 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.374981 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-z977l"] Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.376715 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.386533 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"48c0cdd0-b92a-46ef-a4bf-37c608843f0e\") " pod="openstack/nova-metadata-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.386567 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-config-data\") pod \"nova-metadata-0\" (UID: \"48c0cdd0-b92a-46ef-a4bf-37c608843f0e\") " pod="openstack/nova-metadata-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.386591 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/32448d24-361d-4fbd-934b-404da232f445-logs\") pod \"nova-api-0\" (UID: \"32448d24-361d-4fbd-934b-404da232f445\") " pod="openstack/nova-api-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.386628 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c617a5ac-a683-46f1-989d-d3508405577a-config-data\") pod \"nova-scheduler-0\" (UID: \"c617a5ac-a683-46f1-989d-d3508405577a\") " pod="openstack/nova-scheduler-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.386649 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c617a5ac-a683-46f1-989d-d3508405577a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c617a5ac-a683-46f1-989d-d3508405577a\") " pod="openstack/nova-scheduler-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.386666 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q7ns\" (UniqueName: \"kubernetes.io/projected/c617a5ac-a683-46f1-989d-d3508405577a-kube-api-access-7q7ns\") pod \"nova-scheduler-0\" (UID: \"c617a5ac-a683-46f1-989d-d3508405577a\") " pod="openstack/nova-scheduler-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.386686 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-logs\") pod \"nova-metadata-0\" (UID: \"48c0cdd0-b92a-46ef-a4bf-37c608843f0e\") " pod="openstack/nova-metadata-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.386723 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vw78\" (UniqueName: \"kubernetes.io/projected/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-kube-api-access-9vw78\") pod 
\"nova-metadata-0\" (UID: \"48c0cdd0-b92a-46ef-a4bf-37c608843f0e\") " pod="openstack/nova-metadata-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.386740 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32448d24-361d-4fbd-934b-404da232f445-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"32448d24-361d-4fbd-934b-404da232f445\") " pod="openstack/nova-api-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.386774 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32448d24-361d-4fbd-934b-404da232f445-config-data\") pod \"nova-api-0\" (UID: \"32448d24-361d-4fbd-934b-404da232f445\") " pod="openstack/nova-api-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.386841 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmjlj\" (UniqueName: \"kubernetes.io/projected/32448d24-361d-4fbd-934b-404da232f445-kube-api-access-fmjlj\") pod \"nova-api-0\" (UID: \"32448d24-361d-4fbd-934b-404da232f445\") " pod="openstack/nova-api-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.388030 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32448d24-361d-4fbd-934b-404da232f445-logs\") pod \"nova-api-0\" (UID: \"32448d24-361d-4fbd-934b-404da232f445\") " pod="openstack/nova-api-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.390406 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-z977l"] Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.402451 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32448d24-361d-4fbd-934b-404da232f445-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"32448d24-361d-4fbd-934b-404da232f445\") " 
pod="openstack/nova-api-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.405932 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32448d24-361d-4fbd-934b-404da232f445-config-data\") pod \"nova-api-0\" (UID: \"32448d24-361d-4fbd-934b-404da232f445\") " pod="openstack/nova-api-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.423356 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmjlj\" (UniqueName: \"kubernetes.io/projected/32448d24-361d-4fbd-934b-404da232f445-kube-api-access-fmjlj\") pod \"nova-api-0\" (UID: \"32448d24-361d-4fbd-934b-404da232f445\") " pod="openstack/nova-api-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.493393 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"48c0cdd0-b92a-46ef-a4bf-37c608843f0e\") " pod="openstack/nova-metadata-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.493450 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-config-data\") pod \"nova-metadata-0\" (UID: \"48c0cdd0-b92a-46ef-a4bf-37c608843f0e\") " pod="openstack/nova-metadata-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.493609 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-z977l\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") " pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.493942 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/c617a5ac-a683-46f1-989d-d3508405577a-config-data\") pod \"nova-scheduler-0\" (UID: \"c617a5ac-a683-46f1-989d-d3508405577a\") " pod="openstack/nova-scheduler-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.493986 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c617a5ac-a683-46f1-989d-d3508405577a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c617a5ac-a683-46f1-989d-d3508405577a\") " pod="openstack/nova-scheduler-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.494013 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q7ns\" (UniqueName: \"kubernetes.io/projected/c617a5ac-a683-46f1-989d-d3508405577a-kube-api-access-7q7ns\") pod \"nova-scheduler-0\" (UID: \"c617a5ac-a683-46f1-989d-d3508405577a\") " pod="openstack/nova-scheduler-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.494052 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-logs\") pod \"nova-metadata-0\" (UID: \"48c0cdd0-b92a-46ef-a4bf-37c608843f0e\") " pod="openstack/nova-metadata-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.494095 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-z977l\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") " pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.494141 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.494210 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vw78\" (UniqueName: \"kubernetes.io/projected/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-kube-api-access-9vw78\") pod \"nova-metadata-0\" (UID: \"48c0cdd0-b92a-46ef-a4bf-37c608843f0e\") " pod="openstack/nova-metadata-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.494338 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-z977l\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") " pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.494374 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-config\") pod \"dnsmasq-dns-845d6d6f59-z977l\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") " pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.494413 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-z977l\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") " pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.494435 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pb56\" (UniqueName: \"kubernetes.io/projected/f5a8492c-3f09-4613-a24f-3f17de65767d-kube-api-access-9pb56\") pod 
\"dnsmasq-dns-845d6d6f59-z977l\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") " pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.494487 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.494521 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8679x\" (UniqueName: \"kubernetes.io/projected/d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d-kube-api-access-8679x\") pod \"nova-cell1-novncproxy-0\" (UID: \"d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.494585 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-logs\") pod \"nova-metadata-0\" (UID: \"48c0cdd0-b92a-46ef-a4bf-37c608843f0e\") " pod="openstack/nova-metadata-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.497016 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.503398 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c617a5ac-a683-46f1-989d-d3508405577a-config-data\") pod \"nova-scheduler-0\" (UID: \"c617a5ac-a683-46f1-989d-d3508405577a\") " pod="openstack/nova-scheduler-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.503490 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-config-data\") pod \"nova-metadata-0\" (UID: \"48c0cdd0-b92a-46ef-a4bf-37c608843f0e\") " pod="openstack/nova-metadata-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.509001 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"48c0cdd0-b92a-46ef-a4bf-37c608843f0e\") " pod="openstack/nova-metadata-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.509081 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c617a5ac-a683-46f1-989d-d3508405577a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c617a5ac-a683-46f1-989d-d3508405577a\") " pod="openstack/nova-scheduler-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.511843 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q7ns\" (UniqueName: \"kubernetes.io/projected/c617a5ac-a683-46f1-989d-d3508405577a-kube-api-access-7q7ns\") pod \"nova-scheduler-0\" (UID: \"c617a5ac-a683-46f1-989d-d3508405577a\") " pod="openstack/nova-scheduler-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.513230 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9vw78\" (UniqueName: \"kubernetes.io/projected/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-kube-api-access-9vw78\") pod \"nova-metadata-0\" (UID: \"48c0cdd0-b92a-46ef-a4bf-37c608843f0e\") " pod="openstack/nova-metadata-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.596046 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-z977l\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") " pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.596097 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.596183 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-z977l\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") " pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.596203 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-config\") pod \"dnsmasq-dns-845d6d6f59-z977l\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") " pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.596228 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-z977l\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") " pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.596242 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pb56\" (UniqueName: \"kubernetes.io/projected/f5a8492c-3f09-4613-a24f-3f17de65767d-kube-api-access-9pb56\") pod \"dnsmasq-dns-845d6d6f59-z977l\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") " pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.596269 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.596287 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8679x\" (UniqueName: \"kubernetes.io/projected/d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d-kube-api-access-8679x\") pod \"nova-cell1-novncproxy-0\" (UID: \"d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.596321 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-z977l\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") " pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.597175 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-z977l\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") " pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.597727 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-z977l\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") " pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.597939 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-config\") pod \"dnsmasq-dns-845d6d6f59-z977l\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") " pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.598258 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-z977l\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") " pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.598567 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-z977l\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") " pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.606316 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.607218 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.615197 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8679x\" (UniqueName: \"kubernetes.io/projected/d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d-kube-api-access-8679x\") pod \"nova-cell1-novncproxy-0\" (UID: \"d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.617094 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pb56\" (UniqueName: \"kubernetes.io/projected/f5a8492c-3f09-4613-a24f-3f17de65767d-kube-api-access-9pb56\") pod \"dnsmasq-dns-845d6d6f59-z977l\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") " pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.652275 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.677834 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.704866 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.791374 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.862010 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-c7th8"] Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.915878 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kncvx"] Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.917055 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kncvx" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.920142 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.920428 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 09 10:46:58 crc kubenswrapper[4740]: I1009 10:46:58.929643 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kncvx"] Oct 09 10:46:59 crc kubenswrapper[4740]: I1009 10:46:59.001191 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 10:46:59 crc kubenswrapper[4740]: I1009 10:46:59.049042 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-c7th8" event={"ID":"6af75d27-96e9-44d0-95cc-d0137b792f96","Type":"ContainerStarted","Data":"908c171c17b665424e9ae8ae6b0c79a91b7257a8dc2ab0715927c86873107477"} Oct 09 10:46:59 crc kubenswrapper[4740]: I1009 10:46:59.110667 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82f30f7b-d441-4f72-aa2b-9fd450738e6d-config-data\") pod \"nova-cell1-conductor-db-sync-kncvx\" (UID: \"82f30f7b-d441-4f72-aa2b-9fd450738e6d\") " pod="openstack/nova-cell1-conductor-db-sync-kncvx" Oct 09 
10:46:59 crc kubenswrapper[4740]: I1009 10:46:59.110769 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82f30f7b-d441-4f72-aa2b-9fd450738e6d-scripts\") pod \"nova-cell1-conductor-db-sync-kncvx\" (UID: \"82f30f7b-d441-4f72-aa2b-9fd450738e6d\") " pod="openstack/nova-cell1-conductor-db-sync-kncvx" Oct 09 10:46:59 crc kubenswrapper[4740]: I1009 10:46:59.110792 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f30f7b-d441-4f72-aa2b-9fd450738e6d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-kncvx\" (UID: \"82f30f7b-d441-4f72-aa2b-9fd450738e6d\") " pod="openstack/nova-cell1-conductor-db-sync-kncvx" Oct 09 10:46:59 crc kubenswrapper[4740]: I1009 10:46:59.110891 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49lf6\" (UniqueName: \"kubernetes.io/projected/82f30f7b-d441-4f72-aa2b-9fd450738e6d-kube-api-access-49lf6\") pod \"nova-cell1-conductor-db-sync-kncvx\" (UID: \"82f30f7b-d441-4f72-aa2b-9fd450738e6d\") " pod="openstack/nova-cell1-conductor-db-sync-kncvx" Oct 09 10:46:59 crc kubenswrapper[4740]: I1009 10:46:59.180557 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 10:46:59 crc kubenswrapper[4740]: I1009 10:46:59.212958 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49lf6\" (UniqueName: \"kubernetes.io/projected/82f30f7b-d441-4f72-aa2b-9fd450738e6d-kube-api-access-49lf6\") pod \"nova-cell1-conductor-db-sync-kncvx\" (UID: \"82f30f7b-d441-4f72-aa2b-9fd450738e6d\") " pod="openstack/nova-cell1-conductor-db-sync-kncvx" Oct 09 10:46:59 crc kubenswrapper[4740]: I1009 10:46:59.213028 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/82f30f7b-d441-4f72-aa2b-9fd450738e6d-config-data\") pod \"nova-cell1-conductor-db-sync-kncvx\" (UID: \"82f30f7b-d441-4f72-aa2b-9fd450738e6d\") " pod="openstack/nova-cell1-conductor-db-sync-kncvx" Oct 09 10:46:59 crc kubenswrapper[4740]: I1009 10:46:59.213084 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82f30f7b-d441-4f72-aa2b-9fd450738e6d-scripts\") pod \"nova-cell1-conductor-db-sync-kncvx\" (UID: \"82f30f7b-d441-4f72-aa2b-9fd450738e6d\") " pod="openstack/nova-cell1-conductor-db-sync-kncvx" Oct 09 10:46:59 crc kubenswrapper[4740]: I1009 10:46:59.213101 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f30f7b-d441-4f72-aa2b-9fd450738e6d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-kncvx\" (UID: \"82f30f7b-d441-4f72-aa2b-9fd450738e6d\") " pod="openstack/nova-cell1-conductor-db-sync-kncvx" Oct 09 10:46:59 crc kubenswrapper[4740]: I1009 10:46:59.224325 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82f30f7b-d441-4f72-aa2b-9fd450738e6d-config-data\") pod \"nova-cell1-conductor-db-sync-kncvx\" (UID: \"82f30f7b-d441-4f72-aa2b-9fd450738e6d\") " pod="openstack/nova-cell1-conductor-db-sync-kncvx" Oct 09 10:46:59 crc kubenswrapper[4740]: I1009 10:46:59.231403 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f30f7b-d441-4f72-aa2b-9fd450738e6d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-kncvx\" (UID: \"82f30f7b-d441-4f72-aa2b-9fd450738e6d\") " pod="openstack/nova-cell1-conductor-db-sync-kncvx" Oct 09 10:46:59 crc kubenswrapper[4740]: I1009 10:46:59.242550 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/82f30f7b-d441-4f72-aa2b-9fd450738e6d-scripts\") pod \"nova-cell1-conductor-db-sync-kncvx\" (UID: \"82f30f7b-d441-4f72-aa2b-9fd450738e6d\") " pod="openstack/nova-cell1-conductor-db-sync-kncvx" Oct 09 10:46:59 crc kubenswrapper[4740]: W1009 10:46:59.273420 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48c0cdd0_b92a_46ef_a4bf_37c608843f0e.slice/crio-e74f2c26ce013407599cd32356830e318455465e5883d84b79820fc47817362d WatchSource:0}: Error finding container e74f2c26ce013407599cd32356830e318455465e5883d84b79820fc47817362d: Status 404 returned error can't find the container with id e74f2c26ce013407599cd32356830e318455465e5883d84b79820fc47817362d Oct 09 10:46:59 crc kubenswrapper[4740]: I1009 10:46:59.274202 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49lf6\" (UniqueName: \"kubernetes.io/projected/82f30f7b-d441-4f72-aa2b-9fd450738e6d-kube-api-access-49lf6\") pod \"nova-cell1-conductor-db-sync-kncvx\" (UID: \"82f30f7b-d441-4f72-aa2b-9fd450738e6d\") " pod="openstack/nova-cell1-conductor-db-sync-kncvx" Oct 09 10:46:59 crc kubenswrapper[4740]: I1009 10:46:59.422359 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 10:46:59 crc kubenswrapper[4740]: I1009 10:46:59.569925 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kncvx" Oct 09 10:46:59 crc kubenswrapper[4740]: I1009 10:46:59.581963 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 10:46:59 crc kubenswrapper[4740]: I1009 10:46:59.703743 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-z977l"] Oct 09 10:46:59 crc kubenswrapper[4740]: W1009 10:46:59.713245 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5a8492c_3f09_4613_a24f_3f17de65767d.slice/crio-66c97661f3d2aedcc9364e7ecd58a78c5e621896ceb1fc6fe140124dace37f37 WatchSource:0}: Error finding container 66c97661f3d2aedcc9364e7ecd58a78c5e621896ceb1fc6fe140124dace37f37: Status 404 returned error can't find the container with id 66c97661f3d2aedcc9364e7ecd58a78c5e621896ceb1fc6fe140124dace37f37 Oct 09 10:46:59 crc kubenswrapper[4740]: I1009 10:46:59.990277 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kncvx"] Oct 09 10:47:00 crc kubenswrapper[4740]: W1009 10:47:00.047906 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82f30f7b_d441_4f72_aa2b_9fd450738e6d.slice/crio-2a5569e09bfe2424de919839f64321547f2ec067a789794a4480ae4cbed63783 WatchSource:0}: Error finding container 2a5569e09bfe2424de919839f64321547f2ec067a789794a4480ae4cbed63783: Status 404 returned error can't find the container with id 2a5569e09bfe2424de919839f64321547f2ec067a789794a4480ae4cbed63783 Oct 09 10:47:00 crc kubenswrapper[4740]: I1009 10:47:00.060584 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"48c0cdd0-b92a-46ef-a4bf-37c608843f0e","Type":"ContainerStarted","Data":"e74f2c26ce013407599cd32356830e318455465e5883d84b79820fc47817362d"} Oct 09 10:47:00 crc kubenswrapper[4740]: I1009 
10:47:00.063675 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d","Type":"ContainerStarted","Data":"6023281e53d3e1d9f46ca49a4af0620bbc2a17469f60898cf65a31292b072544"} Oct 09 10:47:00 crc kubenswrapper[4740]: I1009 10:47:00.066901 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-c7th8" event={"ID":"6af75d27-96e9-44d0-95cc-d0137b792f96","Type":"ContainerStarted","Data":"076989a1e587d3337b54a7081133dd69bf8bef51967206bbb774a5c8d3669522"} Oct 09 10:47:00 crc kubenswrapper[4740]: I1009 10:47:00.068714 4740 generic.go:334] "Generic (PLEG): container finished" podID="f5a8492c-3f09-4613-a24f-3f17de65767d" containerID="6dd8b68ec2e4d8e9395e33995271870ffc141bc52635d64b60da17c2974a8b2c" exitCode=0 Oct 09 10:47:00 crc kubenswrapper[4740]: I1009 10:47:00.068789 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-z977l" event={"ID":"f5a8492c-3f09-4613-a24f-3f17de65767d","Type":"ContainerDied","Data":"6dd8b68ec2e4d8e9395e33995271870ffc141bc52635d64b60da17c2974a8b2c"} Oct 09 10:47:00 crc kubenswrapper[4740]: I1009 10:47:00.068807 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-z977l" event={"ID":"f5a8492c-3f09-4613-a24f-3f17de65767d","Type":"ContainerStarted","Data":"66c97661f3d2aedcc9364e7ecd58a78c5e621896ceb1fc6fe140124dace37f37"} Oct 09 10:47:00 crc kubenswrapper[4740]: I1009 10:47:00.070333 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kncvx" event={"ID":"82f30f7b-d441-4f72-aa2b-9fd450738e6d","Type":"ContainerStarted","Data":"2a5569e09bfe2424de919839f64321547f2ec067a789794a4480ae4cbed63783"} Oct 09 10:47:00 crc kubenswrapper[4740]: I1009 10:47:00.080169 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"32448d24-361d-4fbd-934b-404da232f445","Type":"ContainerStarted","Data":"eb71b4131c4d8dbc6dc5646a33e6b58e7470d3dc657607a873c6c21c5c0b064d"} Oct 09 10:47:00 crc kubenswrapper[4740]: I1009 10:47:00.086327 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-c7th8" podStartSLOduration=3.086309359 podStartE2EDuration="3.086309359s" podCreationTimestamp="2025-10-09 10:46:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:47:00.081291673 +0000 UTC m=+1159.043492064" watchObservedRunningTime="2025-10-09 10:47:00.086309359 +0000 UTC m=+1159.048509740" Oct 09 10:47:00 crc kubenswrapper[4740]: I1009 10:47:00.099703 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c617a5ac-a683-46f1-989d-d3508405577a","Type":"ContainerStarted","Data":"7cfcb8bde60886609043a9d1e0172b84f88a93d9772aeaf071b56cabd480f6ff"} Oct 09 10:47:01 crc kubenswrapper[4740]: I1009 10:47:01.117394 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-z977l" event={"ID":"f5a8492c-3f09-4613-a24f-3f17de65767d","Type":"ContainerStarted","Data":"eff4fe15174ef6aec62b353465763dde306566a9255148808771642fdd0c4772"} Oct 09 10:47:01 crc kubenswrapper[4740]: I1009 10:47:01.117701 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:47:01 crc kubenswrapper[4740]: I1009 10:47:01.119013 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kncvx" event={"ID":"82f30f7b-d441-4f72-aa2b-9fd450738e6d","Type":"ContainerStarted","Data":"57943945125a869628425fad9f5d33c104a0605c5da0aa105261f713413e4840"} Oct 09 10:47:01 crc kubenswrapper[4740]: I1009 10:47:01.141678 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-845d6d6f59-z977l" podStartSLOduration=3.141660225 podStartE2EDuration="3.141660225s" podCreationTimestamp="2025-10-09 10:46:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:47:01.132352613 +0000 UTC m=+1160.094552984" watchObservedRunningTime="2025-10-09 10:47:01.141660225 +0000 UTC m=+1160.103860606" Oct 09 10:47:01 crc kubenswrapper[4740]: I1009 10:47:01.157524 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-kncvx" podStartSLOduration=3.157509015 podStartE2EDuration="3.157509015s" podCreationTimestamp="2025-10-09 10:46:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:47:01.154135293 +0000 UTC m=+1160.116335674" watchObservedRunningTime="2025-10-09 10:47:01.157509015 +0000 UTC m=+1160.119709396" Oct 09 10:47:01 crc kubenswrapper[4740]: I1009 10:47:01.907324 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 10:47:01 crc kubenswrapper[4740]: I1009 10:47:01.973896 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 10:47:03 crc kubenswrapper[4740]: I1009 10:47:03.151534 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"48c0cdd0-b92a-46ef-a4bf-37c608843f0e","Type":"ContainerStarted","Data":"3d26b1c43e7838d916d571e9cf3d266808fa7e84ad221ed3e855e13e3c2e2bec"} Oct 09 10:47:03 crc kubenswrapper[4740]: I1009 10:47:03.152178 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"48c0cdd0-b92a-46ef-a4bf-37c608843f0e","Type":"ContainerStarted","Data":"82ee0a341f2479de6ab97a083689588cb0efcc34f274fa5fe2ea9fc91b203101"} Oct 09 10:47:03 crc kubenswrapper[4740]: I1009 10:47:03.154423 4740 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d","Type":"ContainerStarted","Data":"4f19228943326be7b4738c62800f8f8943969bec456e826d1d8cc934c64ecea7"} Oct 09 10:47:03 crc kubenswrapper[4740]: I1009 10:47:03.154564 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://4f19228943326be7b4738c62800f8f8943969bec456e826d1d8cc934c64ecea7" gracePeriod=30 Oct 09 10:47:03 crc kubenswrapper[4740]: I1009 10:47:03.158738 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"32448d24-361d-4fbd-934b-404da232f445","Type":"ContainerStarted","Data":"00e5de6482d861bb7ba21651265383ae381e1dfc14c291e83326e1a5ef90605d"} Oct 09 10:47:03 crc kubenswrapper[4740]: I1009 10:47:03.163121 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c617a5ac-a683-46f1-989d-d3508405577a","Type":"ContainerStarted","Data":"5c4a279758c20069c890cfb4ff4efcdb45c5a6fb92ef0c79f92b563c8af3a2fb"} Oct 09 10:47:03 crc kubenswrapper[4740]: I1009 10:47:03.174636 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.03538452 podStartE2EDuration="5.174612663s" podCreationTimestamp="2025-10-09 10:46:58 +0000 UTC" firstStartedPulling="2025-10-09 10:46:59.440198578 +0000 UTC m=+1158.402398959" lastFinishedPulling="2025-10-09 10:47:02.579426721 +0000 UTC m=+1161.541627102" observedRunningTime="2025-10-09 10:47:03.169889245 +0000 UTC m=+1162.132089636" watchObservedRunningTime="2025-10-09 10:47:03.174612663 +0000 UTC m=+1162.136813044" Oct 09 10:47:03 crc kubenswrapper[4740]: I1009 10:47:03.196690 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-scheduler-0" podStartSLOduration=2.219275591 podStartE2EDuration="5.19666385s" podCreationTimestamp="2025-10-09 10:46:58 +0000 UTC" firstStartedPulling="2025-10-09 10:46:59.604698524 +0000 UTC m=+1158.566898905" lastFinishedPulling="2025-10-09 10:47:02.582086793 +0000 UTC m=+1161.544287164" observedRunningTime="2025-10-09 10:47:03.186060863 +0000 UTC m=+1162.148261254" watchObservedRunningTime="2025-10-09 10:47:03.19666385 +0000 UTC m=+1162.158864231" Oct 09 10:47:03 crc kubenswrapper[4740]: I1009 10:47:03.208255 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.6608857160000001 podStartE2EDuration="5.208237064s" podCreationTimestamp="2025-10-09 10:46:58 +0000 UTC" firstStartedPulling="2025-10-09 10:46:59.031125167 +0000 UTC m=+1157.993325548" lastFinishedPulling="2025-10-09 10:47:02.578476505 +0000 UTC m=+1161.540676896" observedRunningTime="2025-10-09 10:47:03.203615768 +0000 UTC m=+1162.165816149" watchObservedRunningTime="2025-10-09 10:47:03.208237064 +0000 UTC m=+1162.170437445" Oct 09 10:47:03 crc kubenswrapper[4740]: I1009 10:47:03.679420 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 09 10:47:03 crc kubenswrapper[4740]: I1009 10:47:03.705894 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:04 crc kubenswrapper[4740]: I1009 10:47:04.169068 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 09 10:47:04 crc kubenswrapper[4740]: I1009 10:47:04.177929 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"32448d24-361d-4fbd-934b-404da232f445","Type":"ContainerStarted","Data":"b77c935f95bf3f79531009437baa9de09731613a5056dc28e2f0e597da49b6af"} Oct 09 10:47:04 crc kubenswrapper[4740]: I1009 10:47:04.179485 4740 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/nova-metadata-0" podUID="48c0cdd0-b92a-46ef-a4bf-37c608843f0e" containerName="nova-metadata-log" containerID="cri-o://82ee0a341f2479de6ab97a083689588cb0efcc34f274fa5fe2ea9fc91b203101" gracePeriod=30 Oct 09 10:47:04 crc kubenswrapper[4740]: I1009 10:47:04.179933 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="48c0cdd0-b92a-46ef-a4bf-37c608843f0e" containerName="nova-metadata-metadata" containerID="cri-o://3d26b1c43e7838d916d571e9cf3d266808fa7e84ad221ed3e855e13e3c2e2bec" gracePeriod=30 Oct 09 10:47:04 crc kubenswrapper[4740]: I1009 10:47:04.250863 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.973010167 podStartE2EDuration="6.250844194s" podCreationTimestamp="2025-10-09 10:46:58 +0000 UTC" firstStartedPulling="2025-10-09 10:46:59.286469214 +0000 UTC m=+1158.248669595" lastFinishedPulling="2025-10-09 10:47:02.564303231 +0000 UTC m=+1161.526503622" observedRunningTime="2025-10-09 10:47:04.236244259 +0000 UTC m=+1163.198444670" watchObservedRunningTime="2025-10-09 10:47:04.250844194 +0000 UTC m=+1163.213044575" Oct 09 10:47:04 crc kubenswrapper[4740]: I1009 10:47:04.765098 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 10:47:04 crc kubenswrapper[4740]: I1009 10:47:04.834617 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-logs\") pod \"48c0cdd0-b92a-46ef-a4bf-37c608843f0e\" (UID: \"48c0cdd0-b92a-46ef-a4bf-37c608843f0e\") " Oct 09 10:47:04 crc kubenswrapper[4740]: I1009 10:47:04.834666 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vw78\" (UniqueName: \"kubernetes.io/projected/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-kube-api-access-9vw78\") pod \"48c0cdd0-b92a-46ef-a4bf-37c608843f0e\" (UID: \"48c0cdd0-b92a-46ef-a4bf-37c608843f0e\") " Oct 09 10:47:04 crc kubenswrapper[4740]: I1009 10:47:04.834684 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-combined-ca-bundle\") pod \"48c0cdd0-b92a-46ef-a4bf-37c608843f0e\" (UID: \"48c0cdd0-b92a-46ef-a4bf-37c608843f0e\") " Oct 09 10:47:04 crc kubenswrapper[4740]: I1009 10:47:04.834794 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-config-data\") pod \"48c0cdd0-b92a-46ef-a4bf-37c608843f0e\" (UID: \"48c0cdd0-b92a-46ef-a4bf-37c608843f0e\") " Oct 09 10:47:04 crc kubenswrapper[4740]: I1009 10:47:04.836944 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-logs" (OuterVolumeSpecName: "logs") pod "48c0cdd0-b92a-46ef-a4bf-37c608843f0e" (UID: "48c0cdd0-b92a-46ef-a4bf-37c608843f0e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:47:04 crc kubenswrapper[4740]: I1009 10:47:04.841976 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-kube-api-access-9vw78" (OuterVolumeSpecName: "kube-api-access-9vw78") pod "48c0cdd0-b92a-46ef-a4bf-37c608843f0e" (UID: "48c0cdd0-b92a-46ef-a4bf-37c608843f0e"). InnerVolumeSpecName "kube-api-access-9vw78". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:47:04 crc kubenswrapper[4740]: I1009 10:47:04.865491 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48c0cdd0-b92a-46ef-a4bf-37c608843f0e" (UID: "48c0cdd0-b92a-46ef-a4bf-37c608843f0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:04 crc kubenswrapper[4740]: I1009 10:47:04.880287 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-config-data" (OuterVolumeSpecName: "config-data") pod "48c0cdd0-b92a-46ef-a4bf-37c608843f0e" (UID: "48c0cdd0-b92a-46ef-a4bf-37c608843f0e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:04 crc kubenswrapper[4740]: I1009 10:47:04.938167 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:04 crc kubenswrapper[4740]: I1009 10:47:04.938204 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-logs\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:04 crc kubenswrapper[4740]: I1009 10:47:04.938218 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vw78\" (UniqueName: \"kubernetes.io/projected/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-kube-api-access-9vw78\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:04 crc kubenswrapper[4740]: I1009 10:47:04.938231 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c0cdd0-b92a-46ef-a4bf-37c608843f0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.189623 4740 generic.go:334] "Generic (PLEG): container finished" podID="48c0cdd0-b92a-46ef-a4bf-37c608843f0e" containerID="3d26b1c43e7838d916d571e9cf3d266808fa7e84ad221ed3e855e13e3c2e2bec" exitCode=0 Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.189694 4740 generic.go:334] "Generic (PLEG): container finished" podID="48c0cdd0-b92a-46ef-a4bf-37c608843f0e" containerID="82ee0a341f2479de6ab97a083689588cb0efcc34f274fa5fe2ea9fc91b203101" exitCode=143 Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.189808 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.189705 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"48c0cdd0-b92a-46ef-a4bf-37c608843f0e","Type":"ContainerDied","Data":"3d26b1c43e7838d916d571e9cf3d266808fa7e84ad221ed3e855e13e3c2e2bec"} Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.189950 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"48c0cdd0-b92a-46ef-a4bf-37c608843f0e","Type":"ContainerDied","Data":"82ee0a341f2479de6ab97a083689588cb0efcc34f274fa5fe2ea9fc91b203101"} Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.190018 4740 scope.go:117] "RemoveContainer" containerID="3d26b1c43e7838d916d571e9cf3d266808fa7e84ad221ed3e855e13e3c2e2bec" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.190235 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"48c0cdd0-b92a-46ef-a4bf-37c608843f0e","Type":"ContainerDied","Data":"e74f2c26ce013407599cd32356830e318455465e5883d84b79820fc47817362d"} Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.227103 4740 scope.go:117] "RemoveContainer" containerID="82ee0a341f2479de6ab97a083689588cb0efcc34f274fa5fe2ea9fc91b203101" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.247816 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.252396 4740 scope.go:117] "RemoveContainer" containerID="3d26b1c43e7838d916d571e9cf3d266808fa7e84ad221ed3e855e13e3c2e2bec" Oct 09 10:47:05 crc kubenswrapper[4740]: E1009 10:47:05.253450 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d26b1c43e7838d916d571e9cf3d266808fa7e84ad221ed3e855e13e3c2e2bec\": container with ID starting with 3d26b1c43e7838d916d571e9cf3d266808fa7e84ad221ed3e855e13e3c2e2bec 
not found: ID does not exist" containerID="3d26b1c43e7838d916d571e9cf3d266808fa7e84ad221ed3e855e13e3c2e2bec" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.253518 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d26b1c43e7838d916d571e9cf3d266808fa7e84ad221ed3e855e13e3c2e2bec"} err="failed to get container status \"3d26b1c43e7838d916d571e9cf3d266808fa7e84ad221ed3e855e13e3c2e2bec\": rpc error: code = NotFound desc = could not find container \"3d26b1c43e7838d916d571e9cf3d266808fa7e84ad221ed3e855e13e3c2e2bec\": container with ID starting with 3d26b1c43e7838d916d571e9cf3d266808fa7e84ad221ed3e855e13e3c2e2bec not found: ID does not exist" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.253559 4740 scope.go:117] "RemoveContainer" containerID="82ee0a341f2479de6ab97a083689588cb0efcc34f274fa5fe2ea9fc91b203101" Oct 09 10:47:05 crc kubenswrapper[4740]: E1009 10:47:05.254688 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82ee0a341f2479de6ab97a083689588cb0efcc34f274fa5fe2ea9fc91b203101\": container with ID starting with 82ee0a341f2479de6ab97a083689588cb0efcc34f274fa5fe2ea9fc91b203101 not found: ID does not exist" containerID="82ee0a341f2479de6ab97a083689588cb0efcc34f274fa5fe2ea9fc91b203101" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.254792 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82ee0a341f2479de6ab97a083689588cb0efcc34f274fa5fe2ea9fc91b203101"} err="failed to get container status \"82ee0a341f2479de6ab97a083689588cb0efcc34f274fa5fe2ea9fc91b203101\": rpc error: code = NotFound desc = could not find container \"82ee0a341f2479de6ab97a083689588cb0efcc34f274fa5fe2ea9fc91b203101\": container with ID starting with 82ee0a341f2479de6ab97a083689588cb0efcc34f274fa5fe2ea9fc91b203101 not found: ID does not exist" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 
10:47:05.254848 4740 scope.go:117] "RemoveContainer" containerID="3d26b1c43e7838d916d571e9cf3d266808fa7e84ad221ed3e855e13e3c2e2bec" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.258395 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d26b1c43e7838d916d571e9cf3d266808fa7e84ad221ed3e855e13e3c2e2bec"} err="failed to get container status \"3d26b1c43e7838d916d571e9cf3d266808fa7e84ad221ed3e855e13e3c2e2bec\": rpc error: code = NotFound desc = could not find container \"3d26b1c43e7838d916d571e9cf3d266808fa7e84ad221ed3e855e13e3c2e2bec\": container with ID starting with 3d26b1c43e7838d916d571e9cf3d266808fa7e84ad221ed3e855e13e3c2e2bec not found: ID does not exist" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.258459 4740 scope.go:117] "RemoveContainer" containerID="82ee0a341f2479de6ab97a083689588cb0efcc34f274fa5fe2ea9fc91b203101" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.262686 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82ee0a341f2479de6ab97a083689588cb0efcc34f274fa5fe2ea9fc91b203101"} err="failed to get container status \"82ee0a341f2479de6ab97a083689588cb0efcc34f274fa5fe2ea9fc91b203101\": rpc error: code = NotFound desc = could not find container \"82ee0a341f2479de6ab97a083689588cb0efcc34f274fa5fe2ea9fc91b203101\": container with ID starting with 82ee0a341f2479de6ab97a083689588cb0efcc34f274fa5fe2ea9fc91b203101 not found: ID does not exist" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.291127 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.300822 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 09 10:47:05 crc kubenswrapper[4740]: E1009 10:47:05.301254 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c0cdd0-b92a-46ef-a4bf-37c608843f0e" 
containerName="nova-metadata-metadata" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.301276 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c0cdd0-b92a-46ef-a4bf-37c608843f0e" containerName="nova-metadata-metadata" Oct 09 10:47:05 crc kubenswrapper[4740]: E1009 10:47:05.301306 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c0cdd0-b92a-46ef-a4bf-37c608843f0e" containerName="nova-metadata-log" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.301317 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c0cdd0-b92a-46ef-a4bf-37c608843f0e" containerName="nova-metadata-log" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.301537 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="48c0cdd0-b92a-46ef-a4bf-37c608843f0e" containerName="nova-metadata-metadata" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.301571 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="48c0cdd0-b92a-46ef-a4bf-37c608843f0e" containerName="nova-metadata-log" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.303571 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.311149 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.318940 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.319233 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.349427 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c2bc7d-e151-47df-8a00-e895b880cb93-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f4c2bc7d-e151-47df-8a00-e895b880cb93\") " pod="openstack/nova-metadata-0" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.349520 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n62cf\" (UniqueName: \"kubernetes.io/projected/f4c2bc7d-e151-47df-8a00-e895b880cb93-kube-api-access-n62cf\") pod \"nova-metadata-0\" (UID: \"f4c2bc7d-e151-47df-8a00-e895b880cb93\") " pod="openstack/nova-metadata-0" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.349590 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c2bc7d-e151-47df-8a00-e895b880cb93-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f4c2bc7d-e151-47df-8a00-e895b880cb93\") " pod="openstack/nova-metadata-0" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.349655 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f4c2bc7d-e151-47df-8a00-e895b880cb93-config-data\") pod \"nova-metadata-0\" (UID: \"f4c2bc7d-e151-47df-8a00-e895b880cb93\") " pod="openstack/nova-metadata-0" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.349797 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4c2bc7d-e151-47df-8a00-e895b880cb93-logs\") pod \"nova-metadata-0\" (UID: \"f4c2bc7d-e151-47df-8a00-e895b880cb93\") " pod="openstack/nova-metadata-0" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.407532 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.407586 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.407639 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.408511 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fbbd1d786738a0dbe0197a069ad3e53334cad14f3901ee957620b2bd7f765083"} pod="openshift-machine-config-operator/machine-config-daemon-kdjch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.408566 4740 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" containerID="cri-o://fbbd1d786738a0dbe0197a069ad3e53334cad14f3901ee957620b2bd7f765083" gracePeriod=600 Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.451550 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4c2bc7d-e151-47df-8a00-e895b880cb93-logs\") pod \"nova-metadata-0\" (UID: \"f4c2bc7d-e151-47df-8a00-e895b880cb93\") " pod="openstack/nova-metadata-0" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.452464 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4c2bc7d-e151-47df-8a00-e895b880cb93-logs\") pod \"nova-metadata-0\" (UID: \"f4c2bc7d-e151-47df-8a00-e895b880cb93\") " pod="openstack/nova-metadata-0" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.453746 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c2bc7d-e151-47df-8a00-e895b880cb93-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f4c2bc7d-e151-47df-8a00-e895b880cb93\") " pod="openstack/nova-metadata-0" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.453849 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n62cf\" (UniqueName: \"kubernetes.io/projected/f4c2bc7d-e151-47df-8a00-e895b880cb93-kube-api-access-n62cf\") pod \"nova-metadata-0\" (UID: \"f4c2bc7d-e151-47df-8a00-e895b880cb93\") " pod="openstack/nova-metadata-0" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.453893 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f4c2bc7d-e151-47df-8a00-e895b880cb93-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f4c2bc7d-e151-47df-8a00-e895b880cb93\") " pod="openstack/nova-metadata-0" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.453973 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4c2bc7d-e151-47df-8a00-e895b880cb93-config-data\") pod \"nova-metadata-0\" (UID: \"f4c2bc7d-e151-47df-8a00-e895b880cb93\") " pod="openstack/nova-metadata-0" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.459467 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c2bc7d-e151-47df-8a00-e895b880cb93-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f4c2bc7d-e151-47df-8a00-e895b880cb93\") " pod="openstack/nova-metadata-0" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.460214 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c2bc7d-e151-47df-8a00-e895b880cb93-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f4c2bc7d-e151-47df-8a00-e895b880cb93\") " pod="openstack/nova-metadata-0" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.465477 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4c2bc7d-e151-47df-8a00-e895b880cb93-config-data\") pod \"nova-metadata-0\" (UID: \"f4c2bc7d-e151-47df-8a00-e895b880cb93\") " pod="openstack/nova-metadata-0" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.474911 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n62cf\" (UniqueName: \"kubernetes.io/projected/f4c2bc7d-e151-47df-8a00-e895b880cb93-kube-api-access-n62cf\") pod \"nova-metadata-0\" (UID: \"f4c2bc7d-e151-47df-8a00-e895b880cb93\") " pod="openstack/nova-metadata-0" Oct 09 
10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.663377 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 10:47:05 crc kubenswrapper[4740]: I1009 10:47:05.771783 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48c0cdd0-b92a-46ef-a4bf-37c608843f0e" path="/var/lib/kubelet/pods/48c0cdd0-b92a-46ef-a4bf-37c608843f0e/volumes" Oct 09 10:47:06 crc kubenswrapper[4740]: I1009 10:47:06.189992 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 10:47:06 crc kubenswrapper[4740]: W1009 10:47:06.193917 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4c2bc7d_e151_47df_8a00_e895b880cb93.slice/crio-1631e09476c54459d9211380eb78ca62d260554b8386d17baa0fc2e4310ac921 WatchSource:0}: Error finding container 1631e09476c54459d9211380eb78ca62d260554b8386d17baa0fc2e4310ac921: Status 404 returned error can't find the container with id 1631e09476c54459d9211380eb78ca62d260554b8386d17baa0fc2e4310ac921 Oct 09 10:47:06 crc kubenswrapper[4740]: I1009 10:47:06.203429 4740 generic.go:334] "Generic (PLEG): container finished" podID="223b849a-db98-4f56-a649-9e144189950a" containerID="fbbd1d786738a0dbe0197a069ad3e53334cad14f3901ee957620b2bd7f765083" exitCode=0 Oct 09 10:47:06 crc kubenswrapper[4740]: I1009 10:47:06.203476 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerDied","Data":"fbbd1d786738a0dbe0197a069ad3e53334cad14f3901ee957620b2bd7f765083"} Oct 09 10:47:06 crc kubenswrapper[4740]: I1009 10:47:06.203506 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" 
event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerStarted","Data":"198944336c8d712e7c21457778dd2b3f352b6a523b8c1f1ec0b48f4c6d926ff3"} Oct 09 10:47:06 crc kubenswrapper[4740]: I1009 10:47:06.203527 4740 scope.go:117] "RemoveContainer" containerID="db6bdc02b2d1bf480bf563dc4b4a9b65b436c587d39e3c847d517ccd6a5d7f1c" Oct 09 10:47:07 crc kubenswrapper[4740]: I1009 10:47:07.212554 4740 generic.go:334] "Generic (PLEG): container finished" podID="6af75d27-96e9-44d0-95cc-d0137b792f96" containerID="076989a1e587d3337b54a7081133dd69bf8bef51967206bbb774a5c8d3669522" exitCode=0 Oct 09 10:47:07 crc kubenswrapper[4740]: I1009 10:47:07.212645 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-c7th8" event={"ID":"6af75d27-96e9-44d0-95cc-d0137b792f96","Type":"ContainerDied","Data":"076989a1e587d3337b54a7081133dd69bf8bef51967206bbb774a5c8d3669522"} Oct 09 10:47:07 crc kubenswrapper[4740]: I1009 10:47:07.216338 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f4c2bc7d-e151-47df-8a00-e895b880cb93","Type":"ContainerStarted","Data":"9881058e273ff05639310d5f9de24eeee90157a3d896f0d0f90646d4f74f6415"} Oct 09 10:47:07 crc kubenswrapper[4740]: I1009 10:47:07.216377 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f4c2bc7d-e151-47df-8a00-e895b880cb93","Type":"ContainerStarted","Data":"22f763f9c412d0803777b76db075f52fde9343064516b456422797eb6534cadb"} Oct 09 10:47:07 crc kubenswrapper[4740]: I1009 10:47:07.216387 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f4c2bc7d-e151-47df-8a00-e895b880cb93","Type":"ContainerStarted","Data":"1631e09476c54459d9211380eb78ca62d260554b8386d17baa0fc2e4310ac921"} Oct 09 10:47:07 crc kubenswrapper[4740]: I1009 10:47:07.256505 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" 
podStartSLOduration=2.256484177 podStartE2EDuration="2.256484177s" podCreationTimestamp="2025-10-09 10:47:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:47:07.242060452 +0000 UTC m=+1166.204260843" watchObservedRunningTime="2025-10-09 10:47:07.256484177 +0000 UTC m=+1166.218684568" Oct 09 10:47:07 crc kubenswrapper[4740]: I1009 10:47:07.976235 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 10:47:07 crc kubenswrapper[4740]: I1009 10:47:07.976871 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="97026040-37a7-4aa6-aad1-a9b204d4d329" containerName="kube-state-metrics" containerID="cri-o://97072505b68eafbf40eeaa31da55cadfb06b4e193e60ca5d17cde2ac2256a303" gracePeriod=30 Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.124579 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="97026040-37a7-4aa6-aad1-a9b204d4d329" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": dial tcp 10.217.0.105:8081: connect: connection refused" Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.240116 4740 generic.go:334] "Generic (PLEG): container finished" podID="97026040-37a7-4aa6-aad1-a9b204d4d329" containerID="97072505b68eafbf40eeaa31da55cadfb06b4e193e60ca5d17cde2ac2256a303" exitCode=2 Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.240197 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"97026040-37a7-4aa6-aad1-a9b204d4d329","Type":"ContainerDied","Data":"97072505b68eafbf40eeaa31da55cadfb06b4e193e60ca5d17cde2ac2256a303"} Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.243026 4740 generic.go:334] "Generic (PLEG): container finished" podID="82f30f7b-d441-4f72-aa2b-9fd450738e6d" 
containerID="57943945125a869628425fad9f5d33c104a0605c5da0aa105261f713413e4840" exitCode=0 Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.243107 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kncvx" event={"ID":"82f30f7b-d441-4f72-aa2b-9fd450738e6d","Type":"ContainerDied","Data":"57943945125a869628425fad9f5d33c104a0605c5da0aa105261f713413e4840"} Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.497322 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.498010 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.538121 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.621822 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97zmg\" (UniqueName: \"kubernetes.io/projected/97026040-37a7-4aa6-aad1-a9b204d4d329-kube-api-access-97zmg\") pod \"97026040-37a7-4aa6-aad1-a9b204d4d329\" (UID: \"97026040-37a7-4aa6-aad1-a9b204d4d329\") " Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.631049 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97026040-37a7-4aa6-aad1-a9b204d4d329-kube-api-access-97zmg" (OuterVolumeSpecName: "kube-api-access-97zmg") pod "97026040-37a7-4aa6-aad1-a9b204d4d329" (UID: "97026040-37a7-4aa6-aad1-a9b204d4d329"). InnerVolumeSpecName "kube-api-access-97zmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.657955 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-c7th8" Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.682023 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.728336 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af75d27-96e9-44d0-95cc-d0137b792f96-config-data\") pod \"6af75d27-96e9-44d0-95cc-d0137b792f96\" (UID: \"6af75d27-96e9-44d0-95cc-d0137b792f96\") " Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.728464 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af75d27-96e9-44d0-95cc-d0137b792f96-combined-ca-bundle\") pod \"6af75d27-96e9-44d0-95cc-d0137b792f96\" (UID: \"6af75d27-96e9-44d0-95cc-d0137b792f96\") " Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.728586 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82jw8\" (UniqueName: \"kubernetes.io/projected/6af75d27-96e9-44d0-95cc-d0137b792f96-kube-api-access-82jw8\") pod \"6af75d27-96e9-44d0-95cc-d0137b792f96\" (UID: \"6af75d27-96e9-44d0-95cc-d0137b792f96\") " Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.728690 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af75d27-96e9-44d0-95cc-d0137b792f96-scripts\") pod \"6af75d27-96e9-44d0-95cc-d0137b792f96\" (UID: \"6af75d27-96e9-44d0-95cc-d0137b792f96\") " Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.729243 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97zmg\" (UniqueName: \"kubernetes.io/projected/97026040-37a7-4aa6-aad1-a9b204d4d329-kube-api-access-97zmg\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.737901 
4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af75d27-96e9-44d0-95cc-d0137b792f96-scripts" (OuterVolumeSpecName: "scripts") pod "6af75d27-96e9-44d0-95cc-d0137b792f96" (UID: "6af75d27-96e9-44d0-95cc-d0137b792f96"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.749083 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.754209 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6af75d27-96e9-44d0-95cc-d0137b792f96-kube-api-access-82jw8" (OuterVolumeSpecName: "kube-api-access-82jw8") pod "6af75d27-96e9-44d0-95cc-d0137b792f96" (UID: "6af75d27-96e9-44d0-95cc-d0137b792f96"). InnerVolumeSpecName "kube-api-access-82jw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.775870 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af75d27-96e9-44d0-95cc-d0137b792f96-config-data" (OuterVolumeSpecName: "config-data") pod "6af75d27-96e9-44d0-95cc-d0137b792f96" (UID: "6af75d27-96e9-44d0-95cc-d0137b792f96"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.808890 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-z977l" Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.832136 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82jw8\" (UniqueName: \"kubernetes.io/projected/6af75d27-96e9-44d0-95cc-d0137b792f96-kube-api-access-82jw8\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.832176 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af75d27-96e9-44d0-95cc-d0137b792f96-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.832188 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af75d27-96e9-44d0-95cc-d0137b792f96-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.853924 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af75d27-96e9-44d0-95cc-d0137b792f96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6af75d27-96e9-44d0-95cc-d0137b792f96" (UID: "6af75d27-96e9-44d0-95cc-d0137b792f96"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.909576 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-8wzch"] Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.909815 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-8wzch" podUID="6c511056-3872-4b33-92b4-51ff3fa0d287" containerName="dnsmasq-dns" containerID="cri-o://7ca0132ed94fddc367215f2a02e9ca5e33317f527172ccaf94b42c83456bb904" gracePeriod=10 Oct 09 10:47:08 crc kubenswrapper[4740]: I1009 10:47:08.934135 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af75d27-96e9-44d0-95cc-d0137b792f96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.266447 4740 generic.go:334] "Generic (PLEG): container finished" podID="6c511056-3872-4b33-92b4-51ff3fa0d287" containerID="7ca0132ed94fddc367215f2a02e9ca5e33317f527172ccaf94b42c83456bb904" exitCode=0 Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.266773 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-8wzch" event={"ID":"6c511056-3872-4b33-92b4-51ff3fa0d287","Type":"ContainerDied","Data":"7ca0132ed94fddc367215f2a02e9ca5e33317f527172ccaf94b42c83456bb904"} Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.269005 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-c7th8" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.269052 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-c7th8" event={"ID":"6af75d27-96e9-44d0-95cc-d0137b792f96","Type":"ContainerDied","Data":"908c171c17b665424e9ae8ae6b0c79a91b7257a8dc2ab0715927c86873107477"} Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.269125 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="908c171c17b665424e9ae8ae6b0c79a91b7257a8dc2ab0715927c86873107477" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.283831 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"97026040-37a7-4aa6-aad1-a9b204d4d329","Type":"ContainerDied","Data":"b2afae48b8ac6f6b9c5abcb56f77dd46ba08b162589bd4f95c339c35231a27e1"} Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.283934 4740 scope.go:117] "RemoveContainer" containerID="97072505b68eafbf40eeaa31da55cadfb06b4e193e60ca5d17cde2ac2256a303" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.283973 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.338904 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.357592 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.362927 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.379975 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 10:47:09 crc kubenswrapper[4740]: E1009 10:47:09.380393 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97026040-37a7-4aa6-aad1-a9b204d4d329" containerName="kube-state-metrics" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.380407 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="97026040-37a7-4aa6-aad1-a9b204d4d329" containerName="kube-state-metrics" Oct 09 10:47:09 crc kubenswrapper[4740]: E1009 10:47:09.380415 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af75d27-96e9-44d0-95cc-d0137b792f96" containerName="nova-manage" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.380421 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af75d27-96e9-44d0-95cc-d0137b792f96" containerName="nova-manage" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.380623 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="97026040-37a7-4aa6-aad1-a9b204d4d329" containerName="kube-state-metrics" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.380641 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="6af75d27-96e9-44d0-95cc-d0137b792f96" containerName="nova-manage" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.381268 4740 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.383470 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.383635 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.395312 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.449931 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a3a627-eea7-4034-a615-38c388851e07-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"36a3a627-eea7-4034-a615-38c388851e07\") " pod="openstack/kube-state-metrics-0" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.450066 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/36a3a627-eea7-4034-a615-38c388851e07-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"36a3a627-eea7-4034-a615-38c388851e07\") " pod="openstack/kube-state-metrics-0" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.450315 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79jkm\" (UniqueName: \"kubernetes.io/projected/36a3a627-eea7-4034-a615-38c388851e07-kube-api-access-79jkm\") pod \"kube-state-metrics-0\" (UID: \"36a3a627-eea7-4034-a615-38c388851e07\") " pod="openstack/kube-state-metrics-0" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.450355 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a3a627-eea7-4034-a615-38c388851e07-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"36a3a627-eea7-4034-a615-38c388851e07\") " pod="openstack/kube-state-metrics-0" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.517017 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.525515 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.529276 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.529466 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f4c2bc7d-e151-47df-8a00-e895b880cb93" containerName="nova-metadata-log" containerID="cri-o://22f763f9c412d0803777b76db075f52fde9343064516b456422797eb6534cadb" gracePeriod=30 Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.529585 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f4c2bc7d-e151-47df-8a00-e895b880cb93" containerName="nova-metadata-metadata" containerID="cri-o://9881058e273ff05639310d5f9de24eeee90157a3d896f0d0f90646d4f74f6415" gracePeriod=30 Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.538065 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="32448d24-361d-4fbd-934b-404da232f445" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.551978 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/36a3a627-eea7-4034-a615-38c388851e07-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"36a3a627-eea7-4034-a615-38c388851e07\") " pod="openstack/kube-state-metrics-0" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.552078 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79jkm\" (UniqueName: \"kubernetes.io/projected/36a3a627-eea7-4034-a615-38c388851e07-kube-api-access-79jkm\") pod \"kube-state-metrics-0\" (UID: \"36a3a627-eea7-4034-a615-38c388851e07\") " pod="openstack/kube-state-metrics-0" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.553018 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a3a627-eea7-4034-a615-38c388851e07-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"36a3a627-eea7-4034-a615-38c388851e07\") " pod="openstack/kube-state-metrics-0" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.553170 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a3a627-eea7-4034-a615-38c388851e07-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"36a3a627-eea7-4034-a615-38c388851e07\") " pod="openstack/kube-state-metrics-0" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.563109 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a3a627-eea7-4034-a615-38c388851e07-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"36a3a627-eea7-4034-a615-38c388851e07\") " pod="openstack/kube-state-metrics-0" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.576591 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79jkm\" (UniqueName: 
\"kubernetes.io/projected/36a3a627-eea7-4034-a615-38c388851e07-kube-api-access-79jkm\") pod \"kube-state-metrics-0\" (UID: \"36a3a627-eea7-4034-a615-38c388851e07\") " pod="openstack/kube-state-metrics-0" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.588185 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/36a3a627-eea7-4034-a615-38c388851e07-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"36a3a627-eea7-4034-a615-38c388851e07\") " pod="openstack/kube-state-metrics-0" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.588260 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="32448d24-361d-4fbd-934b-404da232f445" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.606457 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a3a627-eea7-4034-a615-38c388851e07-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"36a3a627-eea7-4034-a615-38c388851e07\") " pod="openstack/kube-state-metrics-0" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.755924 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-dns-swift-storage-0\") pod \"6c511056-3872-4b33-92b4-51ff3fa0d287\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.755988 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-config\") pod \"6c511056-3872-4b33-92b4-51ff3fa0d287\" 
(UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.756055 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpszj\" (UniqueName: \"kubernetes.io/projected/6c511056-3872-4b33-92b4-51ff3fa0d287-kube-api-access-cpszj\") pod \"6c511056-3872-4b33-92b4-51ff3fa0d287\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.756079 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-ovsdbserver-nb\") pod \"6c511056-3872-4b33-92b4-51ff3fa0d287\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.756219 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-ovsdbserver-sb\") pod \"6c511056-3872-4b33-92b4-51ff3fa0d287\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.756241 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-dns-svc\") pod \"6c511056-3872-4b33-92b4-51ff3fa0d287\" (UID: \"6c511056-3872-4b33-92b4-51ff3fa0d287\") " Oct 09 10:47:09 crc kubenswrapper[4740]: E1009 10:47:09.759124 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4c2bc7d_e151_47df_8a00_e895b880cb93.slice/crio-22f763f9c412d0803777b76db075f52fde9343064516b456422797eb6534cadb.scope\": RecentStats: unable to find data in memory cache]" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.773255 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="97026040-37a7-4aa6-aad1-a9b204d4d329" path="/var/lib/kubelet/pods/97026040-37a7-4aa6-aad1-a9b204d4d329/volumes" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.790020 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c511056-3872-4b33-92b4-51ff3fa0d287-kube-api-access-cpszj" (OuterVolumeSpecName: "kube-api-access-cpszj") pod "6c511056-3872-4b33-92b4-51ff3fa0d287" (UID: "6c511056-3872-4b33-92b4-51ff3fa0d287"). InnerVolumeSpecName "kube-api-access-cpszj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.793407 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kncvx" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.824654 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.830353 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c511056-3872-4b33-92b4-51ff3fa0d287" (UID: "6c511056-3872-4b33-92b4-51ff3fa0d287"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.840557 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6c511056-3872-4b33-92b4-51ff3fa0d287" (UID: "6c511056-3872-4b33-92b4-51ff3fa0d287"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.851420 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-config" (OuterVolumeSpecName: "config") pod "6c511056-3872-4b33-92b4-51ff3fa0d287" (UID: "6c511056-3872-4b33-92b4-51ff3fa0d287"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.861471 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.861512 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.861525 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpszj\" (UniqueName: \"kubernetes.io/projected/6c511056-3872-4b33-92b4-51ff3fa0d287-kube-api-access-cpszj\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.861539 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.863956 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6c511056-3872-4b33-92b4-51ff3fa0d287" (UID: "6c511056-3872-4b33-92b4-51ff3fa0d287"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.864682 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6c511056-3872-4b33-92b4-51ff3fa0d287" (UID: "6c511056-3872-4b33-92b4-51ff3fa0d287"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.962327 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f30f7b-d441-4f72-aa2b-9fd450738e6d-combined-ca-bundle\") pod \"82f30f7b-d441-4f72-aa2b-9fd450738e6d\" (UID: \"82f30f7b-d441-4f72-aa2b-9fd450738e6d\") " Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.962399 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82f30f7b-d441-4f72-aa2b-9fd450738e6d-config-data\") pod \"82f30f7b-d441-4f72-aa2b-9fd450738e6d\" (UID: \"82f30f7b-d441-4f72-aa2b-9fd450738e6d\") " Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.962432 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82f30f7b-d441-4f72-aa2b-9fd450738e6d-scripts\") pod \"82f30f7b-d441-4f72-aa2b-9fd450738e6d\" (UID: \"82f30f7b-d441-4f72-aa2b-9fd450738e6d\") " Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.962473 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49lf6\" (UniqueName: \"kubernetes.io/projected/82f30f7b-d441-4f72-aa2b-9fd450738e6d-kube-api-access-49lf6\") pod \"82f30f7b-d441-4f72-aa2b-9fd450738e6d\" (UID: \"82f30f7b-d441-4f72-aa2b-9fd450738e6d\") " Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.962954 4740 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.962969 4740 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c511056-3872-4b33-92b4-51ff3fa0d287-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.970783 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82f30f7b-d441-4f72-aa2b-9fd450738e6d-scripts" (OuterVolumeSpecName: "scripts") pod "82f30f7b-d441-4f72-aa2b-9fd450738e6d" (UID: "82f30f7b-d441-4f72-aa2b-9fd450738e6d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.973467 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 10:47:09 crc kubenswrapper[4740]: I1009 10:47:09.976864 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82f30f7b-d441-4f72-aa2b-9fd450738e6d-kube-api-access-49lf6" (OuterVolumeSpecName: "kube-api-access-49lf6") pod "82f30f7b-d441-4f72-aa2b-9fd450738e6d" (UID: "82f30f7b-d441-4f72-aa2b-9fd450738e6d"). InnerVolumeSpecName "kube-api-access-49lf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.013153 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82f30f7b-d441-4f72-aa2b-9fd450738e6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82f30f7b-d441-4f72-aa2b-9fd450738e6d" (UID: "82f30f7b-d441-4f72-aa2b-9fd450738e6d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.031332 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82f30f7b-d441-4f72-aa2b-9fd450738e6d-config-data" (OuterVolumeSpecName: "config-data") pod "82f30f7b-d441-4f72-aa2b-9fd450738e6d" (UID: "82f30f7b-d441-4f72-aa2b-9fd450738e6d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.065324 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f30f7b-d441-4f72-aa2b-9fd450738e6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.065364 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82f30f7b-d441-4f72-aa2b-9fd450738e6d-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.065375 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82f30f7b-d441-4f72-aa2b-9fd450738e6d-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.065386 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49lf6\" (UniqueName: \"kubernetes.io/projected/82f30f7b-d441-4f72-aa2b-9fd450738e6d-kube-api-access-49lf6\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.205122 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.294779 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-8wzch" event={"ID":"6c511056-3872-4b33-92b4-51ff3fa0d287","Type":"ContainerDied","Data":"482603ad9e072e7942b6ca67d60bbe30ad1d2f7b4096b8cd98fd9960e33eedee"} Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.294826 4740 scope.go:117] "RemoveContainer" containerID="7ca0132ed94fddc367215f2a02e9ca5e33317f527172ccaf94b42c83456bb904" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.294938 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-8wzch" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.297163 4740 generic.go:334] "Generic (PLEG): container finished" podID="f4c2bc7d-e151-47df-8a00-e895b880cb93" containerID="9881058e273ff05639310d5f9de24eeee90157a3d896f0d0f90646d4f74f6415" exitCode=0 Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.297201 4740 generic.go:334] "Generic (PLEG): container finished" podID="f4c2bc7d-e151-47df-8a00-e895b880cb93" containerID="22f763f9c412d0803777b76db075f52fde9343064516b456422797eb6534cadb" exitCode=143 Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.297254 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f4c2bc7d-e151-47df-8a00-e895b880cb93","Type":"ContainerDied","Data":"9881058e273ff05639310d5f9de24eeee90157a3d896f0d0f90646d4f74f6415"} Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.297287 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f4c2bc7d-e151-47df-8a00-e895b880cb93","Type":"ContainerDied","Data":"22f763f9c412d0803777b76db075f52fde9343064516b456422797eb6534cadb"} Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.297302 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"f4c2bc7d-e151-47df-8a00-e895b880cb93","Type":"ContainerDied","Data":"1631e09476c54459d9211380eb78ca62d260554b8386d17baa0fc2e4310ac921"} Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.297366 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.312496 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="32448d24-361d-4fbd-934b-404da232f445" containerName="nova-api-log" containerID="cri-o://00e5de6482d861bb7ba21651265383ae381e1dfc14c291e83326e1a5ef90605d" gracePeriod=30 Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.312963 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kncvx" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.313047 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kncvx" event={"ID":"82f30f7b-d441-4f72-aa2b-9fd450738e6d","Type":"ContainerDied","Data":"2a5569e09bfe2424de919839f64321547f2ec067a789794a4480ae4cbed63783"} Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.313080 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a5569e09bfe2424de919839f64321547f2ec067a789794a4480ae4cbed63783" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.314198 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="32448d24-361d-4fbd-934b-404da232f445" containerName="nova-api-api" containerID="cri-o://b77c935f95bf3f79531009437baa9de09731613a5056dc28e2f0e597da49b6af" gracePeriod=30 Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.335964 4740 scope.go:117] "RemoveContainer" containerID="833b938ba95c6ead698b57005a7c07cca71da6bade145296713a10e9dc89b413" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.337181 4740 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-8wzch"] Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.350146 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-8wzch"] Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.372022 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n62cf\" (UniqueName: \"kubernetes.io/projected/f4c2bc7d-e151-47df-8a00-e895b880cb93-kube-api-access-n62cf\") pod \"f4c2bc7d-e151-47df-8a00-e895b880cb93\" (UID: \"f4c2bc7d-e151-47df-8a00-e895b880cb93\") " Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.372103 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4c2bc7d-e151-47df-8a00-e895b880cb93-config-data\") pod \"f4c2bc7d-e151-47df-8a00-e895b880cb93\" (UID: \"f4c2bc7d-e151-47df-8a00-e895b880cb93\") " Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.373520 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4c2bc7d-e151-47df-8a00-e895b880cb93-logs\") pod \"f4c2bc7d-e151-47df-8a00-e895b880cb93\" (UID: \"f4c2bc7d-e151-47df-8a00-e895b880cb93\") " Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.373561 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c2bc7d-e151-47df-8a00-e895b880cb93-combined-ca-bundle\") pod \"f4c2bc7d-e151-47df-8a00-e895b880cb93\" (UID: \"f4c2bc7d-e151-47df-8a00-e895b880cb93\") " Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.375648 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4c2bc7d-e151-47df-8a00-e895b880cb93-logs" (OuterVolumeSpecName: "logs") pod "f4c2bc7d-e151-47df-8a00-e895b880cb93" (UID: "f4c2bc7d-e151-47df-8a00-e895b880cb93"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.376600 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c2bc7d-e151-47df-8a00-e895b880cb93-nova-metadata-tls-certs\") pod \"f4c2bc7d-e151-47df-8a00-e895b880cb93\" (UID: \"f4c2bc7d-e151-47df-8a00-e895b880cb93\") " Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.377436 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4c2bc7d-e151-47df-8a00-e895b880cb93-logs\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.386619 4740 scope.go:117] "RemoveContainer" containerID="9881058e273ff05639310d5f9de24eeee90157a3d896f0d0f90646d4f74f6415" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.386636 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4c2bc7d-e151-47df-8a00-e895b880cb93-kube-api-access-n62cf" (OuterVolumeSpecName: "kube-api-access-n62cf") pod "f4c2bc7d-e151-47df-8a00-e895b880cb93" (UID: "f4c2bc7d-e151-47df-8a00-e895b880cb93"). InnerVolumeSpecName "kube-api-access-n62cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.387562 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 09 10:47:10 crc kubenswrapper[4740]: E1009 10:47:10.389149 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82f30f7b-d441-4f72-aa2b-9fd450738e6d" containerName="nova-cell1-conductor-db-sync" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.389264 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="82f30f7b-d441-4f72-aa2b-9fd450738e6d" containerName="nova-cell1-conductor-db-sync" Oct 09 10:47:10 crc kubenswrapper[4740]: E1009 10:47:10.389371 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4c2bc7d-e151-47df-8a00-e895b880cb93" containerName="nova-metadata-metadata" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.389728 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4c2bc7d-e151-47df-8a00-e895b880cb93" containerName="nova-metadata-metadata" Oct 09 10:47:10 crc kubenswrapper[4740]: E1009 10:47:10.389840 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4c2bc7d-e151-47df-8a00-e895b880cb93" containerName="nova-metadata-log" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.389909 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4c2bc7d-e151-47df-8a00-e895b880cb93" containerName="nova-metadata-log" Oct 09 10:47:10 crc kubenswrapper[4740]: E1009 10:47:10.389998 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c511056-3872-4b33-92b4-51ff3fa0d287" containerName="init" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.390223 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c511056-3872-4b33-92b4-51ff3fa0d287" containerName="init" Oct 09 10:47:10 crc kubenswrapper[4740]: E1009 10:47:10.390324 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c511056-3872-4b33-92b4-51ff3fa0d287" 
containerName="dnsmasq-dns" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.390401 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c511056-3872-4b33-92b4-51ff3fa0d287" containerName="dnsmasq-dns" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.390877 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c511056-3872-4b33-92b4-51ff3fa0d287" containerName="dnsmasq-dns" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.390972 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4c2bc7d-e151-47df-8a00-e895b880cb93" containerName="nova-metadata-metadata" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.391131 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4c2bc7d-e151-47df-8a00-e895b880cb93" containerName="nova-metadata-log" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.391238 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="82f30f7b-d441-4f72-aa2b-9fd450738e6d" containerName="nova-cell1-conductor-db-sync" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.392897 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.398395 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.416831 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.428200 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4c2bc7d-e151-47df-8a00-e895b880cb93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4c2bc7d-e151-47df-8a00-e895b880cb93" (UID: "f4c2bc7d-e151-47df-8a00-e895b880cb93"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.428818 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.441178 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4c2bc7d-e151-47df-8a00-e895b880cb93-config-data" (OuterVolumeSpecName: "config-data") pod "f4c2bc7d-e151-47df-8a00-e895b880cb93" (UID: "f4c2bc7d-e151-47df-8a00-e895b880cb93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.462267 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4c2bc7d-e151-47df-8a00-e895b880cb93-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f4c2bc7d-e151-47df-8a00-e895b880cb93" (UID: "f4c2bc7d-e151-47df-8a00-e895b880cb93"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.486910 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11f736a-8dcf-45d6-9f8d-7ff8866458fb-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c11f736a-8dcf-45d6-9f8d-7ff8866458fb\") " pod="openstack/nova-cell1-conductor-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.486968 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11f736a-8dcf-45d6-9f8d-7ff8866458fb-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c11f736a-8dcf-45d6-9f8d-7ff8866458fb\") " pod="openstack/nova-cell1-conductor-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.487032 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvzz5\" (UniqueName: \"kubernetes.io/projected/c11f736a-8dcf-45d6-9f8d-7ff8866458fb-kube-api-access-lvzz5\") pod \"nova-cell1-conductor-0\" (UID: \"c11f736a-8dcf-45d6-9f8d-7ff8866458fb\") " pod="openstack/nova-cell1-conductor-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.487119 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c2bc7d-e151-47df-8a00-e895b880cb93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.487145 4740 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c2bc7d-e151-47df-8a00-e895b880cb93-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.487153 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n62cf\" (UniqueName: 
\"kubernetes.io/projected/f4c2bc7d-e151-47df-8a00-e895b880cb93-kube-api-access-n62cf\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.487161 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4c2bc7d-e151-47df-8a00-e895b880cb93-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.548702 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.548983 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed1a4238-024e-420e-9848-cd048fdd24f3" containerName="ceilometer-central-agent" containerID="cri-o://8ca2c805a48391eb7f17c9b40203acd90f090fa047d96f13d08cc43e18cbd74e" gracePeriod=30 Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.549084 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed1a4238-024e-420e-9848-cd048fdd24f3" containerName="proxy-httpd" containerID="cri-o://8c7cb8aeb4be3d42b8ec3c91ff50df0195850a4e5c40bfb5ae6ffd304a47be1e" gracePeriod=30 Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.549123 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed1a4238-024e-420e-9848-cd048fdd24f3" containerName="sg-core" containerID="cri-o://4e048d1a005aaeea374f9d6537232f6ec218752cea298ff87443a25582513bcb" gracePeriod=30 Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.549159 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed1a4238-024e-420e-9848-cd048fdd24f3" containerName="ceilometer-notification-agent" containerID="cri-o://2427dc837e7ecd3f596e1b8961bb99f609986268b60d39d0bc7dabc880589b77" gracePeriod=30 Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.554965 4740 
scope.go:117] "RemoveContainer" containerID="22f763f9c412d0803777b76db075f52fde9343064516b456422797eb6534cadb" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.585380 4740 scope.go:117] "RemoveContainer" containerID="9881058e273ff05639310d5f9de24eeee90157a3d896f0d0f90646d4f74f6415" Oct 09 10:47:10 crc kubenswrapper[4740]: E1009 10:47:10.588879 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9881058e273ff05639310d5f9de24eeee90157a3d896f0d0f90646d4f74f6415\": container with ID starting with 9881058e273ff05639310d5f9de24eeee90157a3d896f0d0f90646d4f74f6415 not found: ID does not exist" containerID="9881058e273ff05639310d5f9de24eeee90157a3d896f0d0f90646d4f74f6415" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.588910 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9881058e273ff05639310d5f9de24eeee90157a3d896f0d0f90646d4f74f6415"} err="failed to get container status \"9881058e273ff05639310d5f9de24eeee90157a3d896f0d0f90646d4f74f6415\": rpc error: code = NotFound desc = could not find container \"9881058e273ff05639310d5f9de24eeee90157a3d896f0d0f90646d4f74f6415\": container with ID starting with 9881058e273ff05639310d5f9de24eeee90157a3d896f0d0f90646d4f74f6415 not found: ID does not exist" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.588929 4740 scope.go:117] "RemoveContainer" containerID="22f763f9c412d0803777b76db075f52fde9343064516b456422797eb6534cadb" Oct 09 10:47:10 crc kubenswrapper[4740]: E1009 10:47:10.589583 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22f763f9c412d0803777b76db075f52fde9343064516b456422797eb6534cadb\": container with ID starting with 22f763f9c412d0803777b76db075f52fde9343064516b456422797eb6534cadb not found: ID does not exist" containerID="22f763f9c412d0803777b76db075f52fde9343064516b456422797eb6534cadb" Oct 09 
10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.590004 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22f763f9c412d0803777b76db075f52fde9343064516b456422797eb6534cadb"} err="failed to get container status \"22f763f9c412d0803777b76db075f52fde9343064516b456422797eb6534cadb\": rpc error: code = NotFound desc = could not find container \"22f763f9c412d0803777b76db075f52fde9343064516b456422797eb6534cadb\": container with ID starting with 22f763f9c412d0803777b76db075f52fde9343064516b456422797eb6534cadb not found: ID does not exist" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.590101 4740 scope.go:117] "RemoveContainer" containerID="9881058e273ff05639310d5f9de24eeee90157a3d896f0d0f90646d4f74f6415" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.590564 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9881058e273ff05639310d5f9de24eeee90157a3d896f0d0f90646d4f74f6415"} err="failed to get container status \"9881058e273ff05639310d5f9de24eeee90157a3d896f0d0f90646d4f74f6415\": rpc error: code = NotFound desc = could not find container \"9881058e273ff05639310d5f9de24eeee90157a3d896f0d0f90646d4f74f6415\": container with ID starting with 9881058e273ff05639310d5f9de24eeee90157a3d896f0d0f90646d4f74f6415 not found: ID does not exist" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.590613 4740 scope.go:117] "RemoveContainer" containerID="22f763f9c412d0803777b76db075f52fde9343064516b456422797eb6534cadb" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.590867 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvzz5\" (UniqueName: \"kubernetes.io/projected/c11f736a-8dcf-45d6-9f8d-7ff8866458fb-kube-api-access-lvzz5\") pod \"nova-cell1-conductor-0\" (UID: \"c11f736a-8dcf-45d6-9f8d-7ff8866458fb\") " pod="openstack/nova-cell1-conductor-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.591005 4740 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22f763f9c412d0803777b76db075f52fde9343064516b456422797eb6534cadb"} err="failed to get container status \"22f763f9c412d0803777b76db075f52fde9343064516b456422797eb6534cadb\": rpc error: code = NotFound desc = could not find container \"22f763f9c412d0803777b76db075f52fde9343064516b456422797eb6534cadb\": container with ID starting with 22f763f9c412d0803777b76db075f52fde9343064516b456422797eb6534cadb not found: ID does not exist" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.591101 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11f736a-8dcf-45d6-9f8d-7ff8866458fb-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c11f736a-8dcf-45d6-9f8d-7ff8866458fb\") " pod="openstack/nova-cell1-conductor-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.591166 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11f736a-8dcf-45d6-9f8d-7ff8866458fb-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c11f736a-8dcf-45d6-9f8d-7ff8866458fb\") " pod="openstack/nova-cell1-conductor-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.595812 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11f736a-8dcf-45d6-9f8d-7ff8866458fb-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c11f736a-8dcf-45d6-9f8d-7ff8866458fb\") " pod="openstack/nova-cell1-conductor-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.597423 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11f736a-8dcf-45d6-9f8d-7ff8866458fb-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c11f736a-8dcf-45d6-9f8d-7ff8866458fb\") " 
pod="openstack/nova-cell1-conductor-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.609847 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvzz5\" (UniqueName: \"kubernetes.io/projected/c11f736a-8dcf-45d6-9f8d-7ff8866458fb-kube-api-access-lvzz5\") pod \"nova-cell1-conductor-0\" (UID: \"c11f736a-8dcf-45d6-9f8d-7ff8866458fb\") " pod="openstack/nova-cell1-conductor-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.636827 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.654972 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.661662 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.663258 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.667560 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.667778 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.701389 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.797931 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c44a55b0-facb-44d5-8343-f2274cc5171d-config-data\") pod \"nova-metadata-0\" (UID: \"c44a55b0-facb-44d5-8343-f2274cc5171d\") " pod="openstack/nova-metadata-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.798185 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44a55b0-facb-44d5-8343-f2274cc5171d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c44a55b0-facb-44d5-8343-f2274cc5171d\") " pod="openstack/nova-metadata-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.798257 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c44a55b0-facb-44d5-8343-f2274cc5171d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c44a55b0-facb-44d5-8343-f2274cc5171d\") " pod="openstack/nova-metadata-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.798356 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56rkf\" (UniqueName: \"kubernetes.io/projected/c44a55b0-facb-44d5-8343-f2274cc5171d-kube-api-access-56rkf\") pod \"nova-metadata-0\" (UID: \"c44a55b0-facb-44d5-8343-f2274cc5171d\") " pod="openstack/nova-metadata-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.798478 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c44a55b0-facb-44d5-8343-f2274cc5171d-logs\") pod \"nova-metadata-0\" (UID: \"c44a55b0-facb-44d5-8343-f2274cc5171d\") " pod="openstack/nova-metadata-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.840826 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.900229 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c44a55b0-facb-44d5-8343-f2274cc5171d-logs\") pod \"nova-metadata-0\" (UID: \"c44a55b0-facb-44d5-8343-f2274cc5171d\") " pod="openstack/nova-metadata-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.900363 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c44a55b0-facb-44d5-8343-f2274cc5171d-config-data\") pod \"nova-metadata-0\" (UID: \"c44a55b0-facb-44d5-8343-f2274cc5171d\") " pod="openstack/nova-metadata-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.900687 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c44a55b0-facb-44d5-8343-f2274cc5171d-logs\") pod \"nova-metadata-0\" (UID: \"c44a55b0-facb-44d5-8343-f2274cc5171d\") " pod="openstack/nova-metadata-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.901554 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44a55b0-facb-44d5-8343-f2274cc5171d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c44a55b0-facb-44d5-8343-f2274cc5171d\") " pod="openstack/nova-metadata-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.901588 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c44a55b0-facb-44d5-8343-f2274cc5171d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c44a55b0-facb-44d5-8343-f2274cc5171d\") " pod="openstack/nova-metadata-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.901792 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-56rkf\" (UniqueName: \"kubernetes.io/projected/c44a55b0-facb-44d5-8343-f2274cc5171d-kube-api-access-56rkf\") pod \"nova-metadata-0\" (UID: \"c44a55b0-facb-44d5-8343-f2274cc5171d\") " pod="openstack/nova-metadata-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.905423 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c44a55b0-facb-44d5-8343-f2274cc5171d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c44a55b0-facb-44d5-8343-f2274cc5171d\") " pod="openstack/nova-metadata-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.905503 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c44a55b0-facb-44d5-8343-f2274cc5171d-config-data\") pod \"nova-metadata-0\" (UID: \"c44a55b0-facb-44d5-8343-f2274cc5171d\") " pod="openstack/nova-metadata-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.905550 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44a55b0-facb-44d5-8343-f2274cc5171d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c44a55b0-facb-44d5-8343-f2274cc5171d\") " pod="openstack/nova-metadata-0" Oct 09 10:47:10 crc kubenswrapper[4740]: I1009 10:47:10.924394 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56rkf\" (UniqueName: \"kubernetes.io/projected/c44a55b0-facb-44d5-8343-f2274cc5171d-kube-api-access-56rkf\") pod \"nova-metadata-0\" (UID: \"c44a55b0-facb-44d5-8343-f2274cc5171d\") " pod="openstack/nova-metadata-0" Oct 09 10:47:11 crc kubenswrapper[4740]: I1009 10:47:11.048393 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 10:47:11 crc kubenswrapper[4740]: I1009 10:47:11.143079 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 09 10:47:11 crc kubenswrapper[4740]: I1009 10:47:11.372494 4740 generic.go:334] "Generic (PLEG): container finished" podID="ed1a4238-024e-420e-9848-cd048fdd24f3" containerID="8c7cb8aeb4be3d42b8ec3c91ff50df0195850a4e5c40bfb5ae6ffd304a47be1e" exitCode=0 Oct 09 10:47:11 crc kubenswrapper[4740]: I1009 10:47:11.372783 4740 generic.go:334] "Generic (PLEG): container finished" podID="ed1a4238-024e-420e-9848-cd048fdd24f3" containerID="4e048d1a005aaeea374f9d6537232f6ec218752cea298ff87443a25582513bcb" exitCode=2 Oct 09 10:47:11 crc kubenswrapper[4740]: I1009 10:47:11.372798 4740 generic.go:334] "Generic (PLEG): container finished" podID="ed1a4238-024e-420e-9848-cd048fdd24f3" containerID="8ca2c805a48391eb7f17c9b40203acd90f090fa047d96f13d08cc43e18cbd74e" exitCode=0 Oct 09 10:47:11 crc kubenswrapper[4740]: I1009 10:47:11.372848 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed1a4238-024e-420e-9848-cd048fdd24f3","Type":"ContainerDied","Data":"8c7cb8aeb4be3d42b8ec3c91ff50df0195850a4e5c40bfb5ae6ffd304a47be1e"} Oct 09 10:47:11 crc kubenswrapper[4740]: I1009 10:47:11.372881 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed1a4238-024e-420e-9848-cd048fdd24f3","Type":"ContainerDied","Data":"4e048d1a005aaeea374f9d6537232f6ec218752cea298ff87443a25582513bcb"} Oct 09 10:47:11 crc kubenswrapper[4740]: I1009 10:47:11.372895 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed1a4238-024e-420e-9848-cd048fdd24f3","Type":"ContainerDied","Data":"8ca2c805a48391eb7f17c9b40203acd90f090fa047d96f13d08cc43e18cbd74e"} Oct 09 10:47:11 crc kubenswrapper[4740]: I1009 10:47:11.405623 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/kube-state-metrics-0" event={"ID":"36a3a627-eea7-4034-a615-38c388851e07","Type":"ContainerStarted","Data":"d0739ffb8a814b106f8904519fa35c5fc198d10edc1316004a692f63f3db447c"} Oct 09 10:47:11 crc kubenswrapper[4740]: I1009 10:47:11.405662 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"36a3a627-eea7-4034-a615-38c388851e07","Type":"ContainerStarted","Data":"17a5e685867f537ae63f51b4aa521298b226f471c51e6e6a0aa44c7672df1196"} Oct 09 10:47:11 crc kubenswrapper[4740]: I1009 10:47:11.405683 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 09 10:47:11 crc kubenswrapper[4740]: I1009 10:47:11.429619 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.026778629 podStartE2EDuration="2.429603106s" podCreationTimestamp="2025-10-09 10:47:09 +0000 UTC" firstStartedPulling="2025-10-09 10:47:10.420366615 +0000 UTC m=+1169.382566996" lastFinishedPulling="2025-10-09 10:47:10.823191092 +0000 UTC m=+1169.785391473" observedRunningTime="2025-10-09 10:47:11.424326782 +0000 UTC m=+1170.386527163" watchObservedRunningTime="2025-10-09 10:47:11.429603106 +0000 UTC m=+1170.391803487" Oct 09 10:47:11 crc kubenswrapper[4740]: I1009 10:47:11.438092 4740 generic.go:334] "Generic (PLEG): container finished" podID="32448d24-361d-4fbd-934b-404da232f445" containerID="00e5de6482d861bb7ba21651265383ae381e1dfc14c291e83326e1a5ef90605d" exitCode=143 Oct 09 10:47:11 crc kubenswrapper[4740]: I1009 10:47:11.438190 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"32448d24-361d-4fbd-934b-404da232f445","Type":"ContainerDied","Data":"00e5de6482d861bb7ba21651265383ae381e1dfc14c291e83326e1a5ef90605d"} Oct 09 10:47:11 crc kubenswrapper[4740]: I1009 10:47:11.462184 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"c11f736a-8dcf-45d6-9f8d-7ff8866458fb","Type":"ContainerStarted","Data":"363412b7d6f8a1d03922300be0e8c8c7bd5fdc8543535c36d08f304e809ea3fb"} Oct 09 10:47:11 crc kubenswrapper[4740]: I1009 10:47:11.462475 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c617a5ac-a683-46f1-989d-d3508405577a" containerName="nova-scheduler-scheduler" containerID="cri-o://5c4a279758c20069c890cfb4ff4efcdb45c5a6fb92ef0c79f92b563c8af3a2fb" gracePeriod=30 Oct 09 10:47:11 crc kubenswrapper[4740]: I1009 10:47:11.611479 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 10:47:11 crc kubenswrapper[4740]: I1009 10:47:11.766978 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c511056-3872-4b33-92b4-51ff3fa0d287" path="/var/lib/kubelet/pods/6c511056-3872-4b33-92b4-51ff3fa0d287/volumes" Oct 09 10:47:11 crc kubenswrapper[4740]: I1009 10:47:11.768320 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4c2bc7d-e151-47df-8a00-e895b880cb93" path="/var/lib/kubelet/pods/f4c2bc7d-e151-47df-8a00-e895b880cb93/volumes" Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.472577 4740 generic.go:334] "Generic (PLEG): container finished" podID="ed1a4238-024e-420e-9848-cd048fdd24f3" containerID="2427dc837e7ecd3f596e1b8961bb99f609986268b60d39d0bc7dabc880589b77" exitCode=0 Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.472937 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.472808 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed1a4238-024e-420e-9848-cd048fdd24f3","Type":"ContainerDied","Data":"2427dc837e7ecd3f596e1b8961bb99f609986268b60d39d0bc7dabc880589b77"} Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.472977 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed1a4238-024e-420e-9848-cd048fdd24f3","Type":"ContainerDied","Data":"8bb7382ee511ac977716b7468417b3abce394f0a99923c532789930b6f01c20c"} Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.473000 4740 scope.go:117] "RemoveContainer" containerID="8c7cb8aeb4be3d42b8ec3c91ff50df0195850a4e5c40bfb5ae6ffd304a47be1e" Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.477846 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c44a55b0-facb-44d5-8343-f2274cc5171d","Type":"ContainerStarted","Data":"b6ea6a86794b59fb1d84d184f594a9fdf55672db0a12b37f300f8972e09b7538"} Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.477876 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c44a55b0-facb-44d5-8343-f2274cc5171d","Type":"ContainerStarted","Data":"052e8ce478536c9612c888f5a22b2015f3fc1d6b3f73396fccf725ed062f5752"} Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.477885 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c44a55b0-facb-44d5-8343-f2274cc5171d","Type":"ContainerStarted","Data":"b4d43e2fc764aba65b513260a7cb2b75b29f101ed22f5408675bf501c020be95"} Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.481350 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"c11f736a-8dcf-45d6-9f8d-7ff8866458fb","Type":"ContainerStarted","Data":"e349a5d92d7a16c3333c54deaef8ef39f7a3b31c3d1168ceb0a99761883ca7d7"} Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.481537 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.508881 4740 scope.go:117] "RemoveContainer" containerID="4e048d1a005aaeea374f9d6537232f6ec218752cea298ff87443a25582513bcb" Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.524490 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.52446949 podStartE2EDuration="2.52446949s" podCreationTimestamp="2025-10-09 10:47:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:47:12.522578288 +0000 UTC m=+1171.484778689" watchObservedRunningTime="2025-10-09 10:47:12.52446949 +0000 UTC m=+1171.486669871" Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.545907 4740 scope.go:117] "RemoveContainer" containerID="2427dc837e7ecd3f596e1b8961bb99f609986268b60d39d0bc7dabc880589b77" Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.547093 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.547083068 podStartE2EDuration="2.547083068s" podCreationTimestamp="2025-10-09 10:47:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:47:12.546351688 +0000 UTC m=+1171.508552069" watchObservedRunningTime="2025-10-09 10:47:12.547083068 +0000 UTC m=+1171.509283449" Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.572252 4740 scope.go:117] "RemoveContainer" containerID="8ca2c805a48391eb7f17c9b40203acd90f090fa047d96f13d08cc43e18cbd74e" Oct 09 10:47:12 crc 
kubenswrapper[4740]: I1009 10:47:12.572892 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed1a4238-024e-420e-9848-cd048fdd24f3-run-httpd\") pod \"ed1a4238-024e-420e-9848-cd048fdd24f3\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.572934 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed1a4238-024e-420e-9848-cd048fdd24f3-log-httpd\") pod \"ed1a4238-024e-420e-9848-cd048fdd24f3\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.572974 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-combined-ca-bundle\") pod \"ed1a4238-024e-420e-9848-cd048fdd24f3\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.573001 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klmhb\" (UniqueName: \"kubernetes.io/projected/ed1a4238-024e-420e-9848-cd048fdd24f3-kube-api-access-klmhb\") pod \"ed1a4238-024e-420e-9848-cd048fdd24f3\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.573037 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-config-data\") pod \"ed1a4238-024e-420e-9848-cd048fdd24f3\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.573099 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-scripts\") pod 
\"ed1a4238-024e-420e-9848-cd048fdd24f3\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.573255 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-sg-core-conf-yaml\") pod \"ed1a4238-024e-420e-9848-cd048fdd24f3\" (UID: \"ed1a4238-024e-420e-9848-cd048fdd24f3\") " Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.575123 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed1a4238-024e-420e-9848-cd048fdd24f3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ed1a4238-024e-420e-9848-cd048fdd24f3" (UID: "ed1a4238-024e-420e-9848-cd048fdd24f3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.577397 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed1a4238-024e-420e-9848-cd048fdd24f3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ed1a4238-024e-420e-9848-cd048fdd24f3" (UID: "ed1a4238-024e-420e-9848-cd048fdd24f3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.586482 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-scripts" (OuterVolumeSpecName: "scripts") pod "ed1a4238-024e-420e-9848-cd048fdd24f3" (UID: "ed1a4238-024e-420e-9848-cd048fdd24f3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.598020 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed1a4238-024e-420e-9848-cd048fdd24f3-kube-api-access-klmhb" (OuterVolumeSpecName: "kube-api-access-klmhb") pod "ed1a4238-024e-420e-9848-cd048fdd24f3" (UID: "ed1a4238-024e-420e-9848-cd048fdd24f3"). InnerVolumeSpecName "kube-api-access-klmhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.608933 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ed1a4238-024e-420e-9848-cd048fdd24f3" (UID: "ed1a4238-024e-420e-9848-cd048fdd24f3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.676116 4740 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.676227 4740 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed1a4238-024e-420e-9848-cd048fdd24f3-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.676243 4740 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed1a4238-024e-420e-9848-cd048fdd24f3-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.677023 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klmhb\" (UniqueName: \"kubernetes.io/projected/ed1a4238-024e-420e-9848-cd048fdd24f3-kube-api-access-klmhb\") on 
node \"crc\" DevicePath \"\"" Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.677047 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.681429 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed1a4238-024e-420e-9848-cd048fdd24f3" (UID: "ed1a4238-024e-420e-9848-cd048fdd24f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.708955 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-config-data" (OuterVolumeSpecName: "config-data") pod "ed1a4238-024e-420e-9848-cd048fdd24f3" (UID: "ed1a4238-024e-420e-9848-cd048fdd24f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.778565 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:12 crc kubenswrapper[4740]: I1009 10:47:12.778594 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1a4238-024e-420e-9848-cd048fdd24f3-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.490481 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.536807 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.541572 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.569619 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:47:13 crc kubenswrapper[4740]: E1009 10:47:13.570058 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed1a4238-024e-420e-9848-cd048fdd24f3" containerName="ceilometer-notification-agent" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.570079 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed1a4238-024e-420e-9848-cd048fdd24f3" containerName="ceilometer-notification-agent" Oct 09 10:47:13 crc kubenswrapper[4740]: E1009 10:47:13.570097 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed1a4238-024e-420e-9848-cd048fdd24f3" containerName="sg-core" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.570106 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed1a4238-024e-420e-9848-cd048fdd24f3" containerName="sg-core" Oct 09 10:47:13 crc kubenswrapper[4740]: E1009 10:47:13.570134 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed1a4238-024e-420e-9848-cd048fdd24f3" containerName="proxy-httpd" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.570147 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed1a4238-024e-420e-9848-cd048fdd24f3" containerName="proxy-httpd" Oct 09 10:47:13 crc kubenswrapper[4740]: E1009 10:47:13.570186 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed1a4238-024e-420e-9848-cd048fdd24f3" containerName="ceilometer-central-agent" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.570198 4740 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ed1a4238-024e-420e-9848-cd048fdd24f3" containerName="ceilometer-central-agent" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.570419 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed1a4238-024e-420e-9848-cd048fdd24f3" containerName="ceilometer-notification-agent" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.570445 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed1a4238-024e-420e-9848-cd048fdd24f3" containerName="ceilometer-central-agent" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.570464 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed1a4238-024e-420e-9848-cd048fdd24f3" containerName="sg-core" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.570482 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed1a4238-024e-420e-9848-cd048fdd24f3" containerName="proxy-httpd" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.572545 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.578422 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.578526 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.582320 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.589617 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.590178 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.590452 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09533833-0903-487b-9963-e36425a64e8a-log-httpd\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.590625 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09533833-0903-487b-9963-e36425a64e8a-run-httpd\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.590849 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6r2wn\" (UniqueName: \"kubernetes.io/projected/09533833-0903-487b-9963-e36425a64e8a-kube-api-access-6r2wn\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.590955 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.591599 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-scripts\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.591656 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-config-data\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.591770 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: E1009 10:47:13.681078 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="5c4a279758c20069c890cfb4ff4efcdb45c5a6fb92ef0c79f92b563c8af3a2fb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 09 10:47:13 crc kubenswrapper[4740]: E1009 10:47:13.682719 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5c4a279758c20069c890cfb4ff4efcdb45c5a6fb92ef0c79f92b563c8af3a2fb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 09 10:47:13 crc kubenswrapper[4740]: E1009 10:47:13.684038 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5c4a279758c20069c890cfb4ff4efcdb45c5a6fb92ef0c79f92b563c8af3a2fb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 09 10:47:13 crc kubenswrapper[4740]: E1009 10:47:13.684079 4740 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c617a5ac-a683-46f1-989d-d3508405577a" containerName="nova-scheduler-scheduler" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.694641 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09533833-0903-487b-9963-e36425a64e8a-run-httpd\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.694048 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09533833-0903-487b-9963-e36425a64e8a-run-httpd\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 
crc kubenswrapper[4740]: I1009 10:47:13.694871 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r2wn\" (UniqueName: \"kubernetes.io/projected/09533833-0903-487b-9963-e36425a64e8a-kube-api-access-6r2wn\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.695436 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.696355 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-scripts\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.696405 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-config-data\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.696450 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.696509 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.696777 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09533833-0903-487b-9963-e36425a64e8a-log-httpd\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.697304 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09533833-0903-487b-9963-e36425a64e8a-log-httpd\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.700473 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-scripts\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.701311 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.707536 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.708167 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-config-data\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.710164 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.715982 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r2wn\" (UniqueName: \"kubernetes.io/projected/09533833-0903-487b-9963-e36425a64e8a-kube-api-access-6r2wn\") pod \"ceilometer-0\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " pod="openstack/ceilometer-0" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.769208 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed1a4238-024e-420e-9848-cd048fdd24f3" path="/var/lib/kubelet/pods/ed1a4238-024e-420e-9848-cd048fdd24f3/volumes" Oct 09 10:47:13 crc kubenswrapper[4740]: I1009 10:47:13.894686 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:47:14 crc kubenswrapper[4740]: I1009 10:47:14.434220 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:47:14 crc kubenswrapper[4740]: W1009 10:47:14.489743 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09533833_0903_487b_9963_e36425a64e8a.slice/crio-1b5427dcbfa95a34de33e513a751a3441a412c3889cb1c64c874f9cf4c3f45b2 WatchSource:0}: Error finding container 1b5427dcbfa95a34de33e513a751a3441a412c3889cb1c64c874f9cf4c3f45b2: Status 404 returned error can't find the container with id 1b5427dcbfa95a34de33e513a751a3441a412c3889cb1c64c874f9cf4c3f45b2 Oct 09 10:47:14 crc kubenswrapper[4740]: I1009 10:47:14.504702 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"09533833-0903-487b-9963-e36425a64e8a","Type":"ContainerStarted","Data":"1b5427dcbfa95a34de33e513a751a3441a412c3889cb1c64c874f9cf4c3f45b2"} Oct 09 10:47:14 crc kubenswrapper[4740]: I1009 10:47:14.508123 4740 generic.go:334] "Generic (PLEG): container finished" podID="c617a5ac-a683-46f1-989d-d3508405577a" containerID="5c4a279758c20069c890cfb4ff4efcdb45c5a6fb92ef0c79f92b563c8af3a2fb" exitCode=0 Oct 09 10:47:14 crc kubenswrapper[4740]: I1009 10:47:14.508188 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c617a5ac-a683-46f1-989d-d3508405577a","Type":"ContainerDied","Data":"5c4a279758c20069c890cfb4ff4efcdb45c5a6fb92ef0c79f92b563c8af3a2fb"} Oct 09 10:47:14 crc kubenswrapper[4740]: I1009 10:47:14.850693 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 10:47:14 crc kubenswrapper[4740]: I1009 10:47:14.923392 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c617a5ac-a683-46f1-989d-d3508405577a-combined-ca-bundle\") pod \"c617a5ac-a683-46f1-989d-d3508405577a\" (UID: \"c617a5ac-a683-46f1-989d-d3508405577a\") " Oct 09 10:47:14 crc kubenswrapper[4740]: I1009 10:47:14.923502 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q7ns\" (UniqueName: \"kubernetes.io/projected/c617a5ac-a683-46f1-989d-d3508405577a-kube-api-access-7q7ns\") pod \"c617a5ac-a683-46f1-989d-d3508405577a\" (UID: \"c617a5ac-a683-46f1-989d-d3508405577a\") " Oct 09 10:47:14 crc kubenswrapper[4740]: I1009 10:47:14.923669 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c617a5ac-a683-46f1-989d-d3508405577a-config-data\") pod \"c617a5ac-a683-46f1-989d-d3508405577a\" (UID: \"c617a5ac-a683-46f1-989d-d3508405577a\") " Oct 09 10:47:14 crc kubenswrapper[4740]: I1009 10:47:14.933042 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c617a5ac-a683-46f1-989d-d3508405577a-kube-api-access-7q7ns" (OuterVolumeSpecName: "kube-api-access-7q7ns") pod "c617a5ac-a683-46f1-989d-d3508405577a" (UID: "c617a5ac-a683-46f1-989d-d3508405577a"). InnerVolumeSpecName "kube-api-access-7q7ns". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:47:14 crc kubenswrapper[4740]: I1009 10:47:14.951370 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c617a5ac-a683-46f1-989d-d3508405577a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c617a5ac-a683-46f1-989d-d3508405577a" (UID: "c617a5ac-a683-46f1-989d-d3508405577a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:14 crc kubenswrapper[4740]: I1009 10:47:14.953426 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c617a5ac-a683-46f1-989d-d3508405577a-config-data" (OuterVolumeSpecName: "config-data") pod "c617a5ac-a683-46f1-989d-d3508405577a" (UID: "c617a5ac-a683-46f1-989d-d3508405577a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.025986 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c617a5ac-a683-46f1-989d-d3508405577a-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.026296 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c617a5ac-a683-46f1-989d-d3508405577a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.026308 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q7ns\" (UniqueName: \"kubernetes.io/projected/c617a5ac-a683-46f1-989d-d3508405577a-kube-api-access-7q7ns\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.518818 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.518823 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c617a5ac-a683-46f1-989d-d3508405577a","Type":"ContainerDied","Data":"7cfcb8bde60886609043a9d1e0172b84f88a93d9772aeaf071b56cabd480f6ff"} Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.519361 4740 scope.go:117] "RemoveContainer" containerID="5c4a279758c20069c890cfb4ff4efcdb45c5a6fb92ef0c79f92b563c8af3a2fb" Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.522137 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"09533833-0903-487b-9963-e36425a64e8a","Type":"ContainerStarted","Data":"28ea3223780372f8400a9eca5e232e0cdf1186c9376ced21a3e8f51f07ace5a2"} Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.568842 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.570767 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.586959 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 10:47:15 crc kubenswrapper[4740]: E1009 10:47:15.587574 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c617a5ac-a683-46f1-989d-d3508405577a" containerName="nova-scheduler-scheduler" Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.587598 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c617a5ac-a683-46f1-989d-d3508405577a" containerName="nova-scheduler-scheduler" Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.587959 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="c617a5ac-a683-46f1-989d-d3508405577a" containerName="nova-scheduler-scheduler" Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.588738 4740 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.591598 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.601622 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.636440 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef87ed9c-12c3-4ee4-9011-a826acaab478-config-data\") pod \"nova-scheduler-0\" (UID: \"ef87ed9c-12c3-4ee4-9011-a826acaab478\") " pod="openstack/nova-scheduler-0" Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.636572 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef87ed9c-12c3-4ee4-9011-a826acaab478-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ef87ed9c-12c3-4ee4-9011-a826acaab478\") " pod="openstack/nova-scheduler-0" Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.636743 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7wfk\" (UniqueName: \"kubernetes.io/projected/ef87ed9c-12c3-4ee4-9011-a826acaab478-kube-api-access-s7wfk\") pod \"nova-scheduler-0\" (UID: \"ef87ed9c-12c3-4ee4-9011-a826acaab478\") " pod="openstack/nova-scheduler-0" Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.738552 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7wfk\" (UniqueName: \"kubernetes.io/projected/ef87ed9c-12c3-4ee4-9011-a826acaab478-kube-api-access-s7wfk\") pod \"nova-scheduler-0\" (UID: \"ef87ed9c-12c3-4ee4-9011-a826acaab478\") " pod="openstack/nova-scheduler-0" Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 
10:47:15.738670 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef87ed9c-12c3-4ee4-9011-a826acaab478-config-data\") pod \"nova-scheduler-0\" (UID: \"ef87ed9c-12c3-4ee4-9011-a826acaab478\") " pod="openstack/nova-scheduler-0" Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.738714 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef87ed9c-12c3-4ee4-9011-a826acaab478-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ef87ed9c-12c3-4ee4-9011-a826acaab478\") " pod="openstack/nova-scheduler-0" Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.746047 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef87ed9c-12c3-4ee4-9011-a826acaab478-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ef87ed9c-12c3-4ee4-9011-a826acaab478\") " pod="openstack/nova-scheduler-0" Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.757506 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef87ed9c-12c3-4ee4-9011-a826acaab478-config-data\") pod \"nova-scheduler-0\" (UID: \"ef87ed9c-12c3-4ee4-9011-a826acaab478\") " pod="openstack/nova-scheduler-0" Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.762587 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7wfk\" (UniqueName: \"kubernetes.io/projected/ef87ed9c-12c3-4ee4-9011-a826acaab478-kube-api-access-s7wfk\") pod \"nova-scheduler-0\" (UID: \"ef87ed9c-12c3-4ee4-9011-a826acaab478\") " pod="openstack/nova-scheduler-0" Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.785900 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c617a5ac-a683-46f1-989d-d3508405577a" 
path="/var/lib/kubelet/pods/c617a5ac-a683-46f1-989d-d3508405577a/volumes" Oct 09 10:47:15 crc kubenswrapper[4740]: I1009 10:47:15.944028 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.050359 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.050644 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.085358 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.144679 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32448d24-361d-4fbd-934b-404da232f445-combined-ca-bundle\") pod \"32448d24-361d-4fbd-934b-404da232f445\" (UID: \"32448d24-361d-4fbd-934b-404da232f445\") " Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.145449 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmjlj\" (UniqueName: \"kubernetes.io/projected/32448d24-361d-4fbd-934b-404da232f445-kube-api-access-fmjlj\") pod \"32448d24-361d-4fbd-934b-404da232f445\" (UID: \"32448d24-361d-4fbd-934b-404da232f445\") " Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.145579 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32448d24-361d-4fbd-934b-404da232f445-logs\") pod \"32448d24-361d-4fbd-934b-404da232f445\" (UID: \"32448d24-361d-4fbd-934b-404da232f445\") " Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.145623 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/32448d24-361d-4fbd-934b-404da232f445-config-data\") pod \"32448d24-361d-4fbd-934b-404da232f445\" (UID: \"32448d24-361d-4fbd-934b-404da232f445\") " Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.147037 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32448d24-361d-4fbd-934b-404da232f445-logs" (OuterVolumeSpecName: "logs") pod "32448d24-361d-4fbd-934b-404da232f445" (UID: "32448d24-361d-4fbd-934b-404da232f445"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.150895 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32448d24-361d-4fbd-934b-404da232f445-kube-api-access-fmjlj" (OuterVolumeSpecName: "kube-api-access-fmjlj") pod "32448d24-361d-4fbd-934b-404da232f445" (UID: "32448d24-361d-4fbd-934b-404da232f445"). InnerVolumeSpecName "kube-api-access-fmjlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.177578 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32448d24-361d-4fbd-934b-404da232f445-config-data" (OuterVolumeSpecName: "config-data") pod "32448d24-361d-4fbd-934b-404da232f445" (UID: "32448d24-361d-4fbd-934b-404da232f445"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.178406 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32448d24-361d-4fbd-934b-404da232f445-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32448d24-361d-4fbd-934b-404da232f445" (UID: "32448d24-361d-4fbd-934b-404da232f445"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.248104 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32448d24-361d-4fbd-934b-404da232f445-logs\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.248133 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32448d24-361d-4fbd-934b-404da232f445-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.248159 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32448d24-361d-4fbd-934b-404da232f445-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.248169 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmjlj\" (UniqueName: \"kubernetes.io/projected/32448d24-361d-4fbd-934b-404da232f445-kube-api-access-fmjlj\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.399163 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 10:47:16 crc kubenswrapper[4740]: W1009 10:47:16.400806 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef87ed9c_12c3_4ee4_9011_a826acaab478.slice/crio-b35c45ab00c4c668ba93e95aef056125f07d1e7cb0470de6eb96cbee56829528 WatchSource:0}: Error finding container b35c45ab00c4c668ba93e95aef056125f07d1e7cb0470de6eb96cbee56829528: Status 404 returned error can't find the container with id b35c45ab00c4c668ba93e95aef056125f07d1e7cb0470de6eb96cbee56829528 Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.532791 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"09533833-0903-487b-9963-e36425a64e8a","Type":"ContainerStarted","Data":"0643a9bc8d8c0f8173997ec0ae715741b92e076e17421a4c282ce59d69996b12"} Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.535443 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ef87ed9c-12c3-4ee4-9011-a826acaab478","Type":"ContainerStarted","Data":"b35c45ab00c4c668ba93e95aef056125f07d1e7cb0470de6eb96cbee56829528"} Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.537985 4740 generic.go:334] "Generic (PLEG): container finished" podID="32448d24-361d-4fbd-934b-404da232f445" containerID="b77c935f95bf3f79531009437baa9de09731613a5056dc28e2f0e597da49b6af" exitCode=0 Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.538083 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"32448d24-361d-4fbd-934b-404da232f445","Type":"ContainerDied","Data":"b77c935f95bf3f79531009437baa9de09731613a5056dc28e2f0e597da49b6af"} Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.538115 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.538133 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"32448d24-361d-4fbd-934b-404da232f445","Type":"ContainerDied","Data":"eb71b4131c4d8dbc6dc5646a33e6b58e7470d3dc657607a873c6c21c5c0b064d"} Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.538152 4740 scope.go:117] "RemoveContainer" containerID="b77c935f95bf3f79531009437baa9de09731613a5056dc28e2f0e597da49b6af" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.591059 4740 scope.go:117] "RemoveContainer" containerID="00e5de6482d861bb7ba21651265383ae381e1dfc14c291e83326e1a5ef90605d" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.609246 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.616283 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.617383 4740 scope.go:117] "RemoveContainer" containerID="b77c935f95bf3f79531009437baa9de09731613a5056dc28e2f0e597da49b6af" Oct 09 10:47:16 crc kubenswrapper[4740]: E1009 10:47:16.618262 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b77c935f95bf3f79531009437baa9de09731613a5056dc28e2f0e597da49b6af\": container with ID starting with b77c935f95bf3f79531009437baa9de09731613a5056dc28e2f0e597da49b6af not found: ID does not exist" containerID="b77c935f95bf3f79531009437baa9de09731613a5056dc28e2f0e597da49b6af" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.618323 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b77c935f95bf3f79531009437baa9de09731613a5056dc28e2f0e597da49b6af"} err="failed to get container status \"b77c935f95bf3f79531009437baa9de09731613a5056dc28e2f0e597da49b6af\": rpc error: code = 
NotFound desc = could not find container \"b77c935f95bf3f79531009437baa9de09731613a5056dc28e2f0e597da49b6af\": container with ID starting with b77c935f95bf3f79531009437baa9de09731613a5056dc28e2f0e597da49b6af not found: ID does not exist" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.618364 4740 scope.go:117] "RemoveContainer" containerID="00e5de6482d861bb7ba21651265383ae381e1dfc14c291e83326e1a5ef90605d" Oct 09 10:47:16 crc kubenswrapper[4740]: E1009 10:47:16.618873 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00e5de6482d861bb7ba21651265383ae381e1dfc14c291e83326e1a5ef90605d\": container with ID starting with 00e5de6482d861bb7ba21651265383ae381e1dfc14c291e83326e1a5ef90605d not found: ID does not exist" containerID="00e5de6482d861bb7ba21651265383ae381e1dfc14c291e83326e1a5ef90605d" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.618901 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00e5de6482d861bb7ba21651265383ae381e1dfc14c291e83326e1a5ef90605d"} err="failed to get container status \"00e5de6482d861bb7ba21651265383ae381e1dfc14c291e83326e1a5ef90605d\": rpc error: code = NotFound desc = could not find container \"00e5de6482d861bb7ba21651265383ae381e1dfc14c291e83326e1a5ef90605d\": container with ID starting with 00e5de6482d861bb7ba21651265383ae381e1dfc14c291e83326e1a5ef90605d not found: ID does not exist" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.632300 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 09 10:47:16 crc kubenswrapper[4740]: E1009 10:47:16.632830 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32448d24-361d-4fbd-934b-404da232f445" containerName="nova-api-api" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.632854 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="32448d24-361d-4fbd-934b-404da232f445" 
containerName="nova-api-api" Oct 09 10:47:16 crc kubenswrapper[4740]: E1009 10:47:16.632882 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32448d24-361d-4fbd-934b-404da232f445" containerName="nova-api-log" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.632889 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="32448d24-361d-4fbd-934b-404da232f445" containerName="nova-api-log" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.633092 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="32448d24-361d-4fbd-934b-404da232f445" containerName="nova-api-api" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.633134 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="32448d24-361d-4fbd-934b-404da232f445" containerName="nova-api-log" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.634207 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.636550 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.651578 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.654946 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhfxt\" (UniqueName: \"kubernetes.io/projected/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-kube-api-access-bhfxt\") pod \"nova-api-0\" (UID: \"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade\") " pod="openstack/nova-api-0" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.655001 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade\") " pod="openstack/nova-api-0" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.655182 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-logs\") pod \"nova-api-0\" (UID: \"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade\") " pod="openstack/nova-api-0" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.655256 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-config-data\") pod \"nova-api-0\" (UID: \"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade\") " pod="openstack/nova-api-0" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.757563 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-logs\") pod \"nova-api-0\" (UID: \"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade\") " pod="openstack/nova-api-0" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.757997 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-config-data\") pod \"nova-api-0\" (UID: \"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade\") " pod="openstack/nova-api-0" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.758079 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhfxt\" (UniqueName: \"kubernetes.io/projected/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-kube-api-access-bhfxt\") pod \"nova-api-0\" (UID: \"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade\") " pod="openstack/nova-api-0" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.758111 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade\") " pod="openstack/nova-api-0" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.758121 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-logs\") pod \"nova-api-0\" (UID: \"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade\") " pod="openstack/nova-api-0" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.765573 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade\") " pod="openstack/nova-api-0" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.766085 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-config-data\") pod \"nova-api-0\" (UID: \"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade\") " pod="openstack/nova-api-0" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.774507 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhfxt\" (UniqueName: \"kubernetes.io/projected/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-kube-api-access-bhfxt\") pod \"nova-api-0\" (UID: \"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade\") " pod="openstack/nova-api-0" Oct 09 10:47:16 crc kubenswrapper[4740]: I1009 10:47:16.951960 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 10:47:17 crc kubenswrapper[4740]: I1009 10:47:17.443406 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 10:47:17 crc kubenswrapper[4740]: I1009 10:47:17.560935 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade","Type":"ContainerStarted","Data":"913009c408883c40606aec6a04f9d4ffdc45a021a6c6298139dd90d6fcbd5db1"} Oct 09 10:47:17 crc kubenswrapper[4740]: I1009 10:47:17.567357 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"09533833-0903-487b-9963-e36425a64e8a","Type":"ContainerStarted","Data":"9733b06e5448b82d0c0fafcee3ad5e381b32cc183d79cc18b4c79e4839c33bda"} Oct 09 10:47:17 crc kubenswrapper[4740]: I1009 10:47:17.573194 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ef87ed9c-12c3-4ee4-9011-a826acaab478","Type":"ContainerStarted","Data":"7a00d6968d0cce600872635acd59d6f9076dc2ff201b4eb9fd26a49cf4211f59"} Oct 09 10:47:17 crc kubenswrapper[4740]: I1009 10:47:17.593627 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.593611125 podStartE2EDuration="2.593611125s" podCreationTimestamp="2025-10-09 10:47:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:47:17.589696908 +0000 UTC m=+1176.551897309" watchObservedRunningTime="2025-10-09 10:47:17.593611125 +0000 UTC m=+1176.555811506" Oct 09 10:47:17 crc kubenswrapper[4740]: I1009 10:47:17.764475 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32448d24-361d-4fbd-934b-404da232f445" path="/var/lib/kubelet/pods/32448d24-361d-4fbd-934b-404da232f445/volumes" Oct 09 10:47:18 crc kubenswrapper[4740]: I1009 10:47:18.588270 4740 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade","Type":"ContainerStarted","Data":"aa6ffea9c45aa695c43c57444146d22723ef90683ce847bf300ba6981445fe8d"} Oct 09 10:47:18 crc kubenswrapper[4740]: I1009 10:47:18.588815 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade","Type":"ContainerStarted","Data":"1c8c1b5f805e09cb142090a02a1057ddcd4537ca022aaed6e2be0362eecb70b8"} Oct 09 10:47:18 crc kubenswrapper[4740]: I1009 10:47:18.612374 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.612356325 podStartE2EDuration="2.612356325s" podCreationTimestamp="2025-10-09 10:47:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:47:18.605260441 +0000 UTC m=+1177.567460812" watchObservedRunningTime="2025-10-09 10:47:18.612356325 +0000 UTC m=+1177.574556706" Oct 09 10:47:19 crc kubenswrapper[4740]: I1009 10:47:19.599648 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"09533833-0903-487b-9963-e36425a64e8a","Type":"ContainerStarted","Data":"804864f74dac8cc7249d36f9704d7f69737734e9d809893a89e7682cfa10f873"} Oct 09 10:47:19 crc kubenswrapper[4740]: I1009 10:47:19.600118 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 10:47:19 crc kubenswrapper[4740]: I1009 10:47:19.625781 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.2862620319999998 podStartE2EDuration="6.62573845s" podCreationTimestamp="2025-10-09 10:47:13 +0000 UTC" firstStartedPulling="2025-10-09 10:47:14.493986534 +0000 UTC m=+1173.456186925" lastFinishedPulling="2025-10-09 10:47:18.833462932 +0000 UTC m=+1177.795663343" observedRunningTime="2025-10-09 
10:47:19.621164635 +0000 UTC m=+1178.583365016" watchObservedRunningTime="2025-10-09 10:47:19.62573845 +0000 UTC m=+1178.587938831" Oct 09 10:47:19 crc kubenswrapper[4740]: I1009 10:47:19.838398 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 09 10:47:20 crc kubenswrapper[4740]: I1009 10:47:20.865534 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 09 10:47:20 crc kubenswrapper[4740]: I1009 10:47:20.945035 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 09 10:47:21 crc kubenswrapper[4740]: I1009 10:47:21.050434 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 09 10:47:21 crc kubenswrapper[4740]: I1009 10:47:21.050504 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 09 10:47:22 crc kubenswrapper[4740]: I1009 10:47:22.065010 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c44a55b0-facb-44d5-8343-f2274cc5171d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 10:47:22 crc kubenswrapper[4740]: I1009 10:47:22.065032 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c44a55b0-facb-44d5-8343-f2274cc5171d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 10:47:25 crc kubenswrapper[4740]: I1009 10:47:25.944430 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 09 10:47:25 crc kubenswrapper[4740]: I1009 10:47:25.984602 4740 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 09 10:47:26 crc kubenswrapper[4740]: I1009 10:47:26.706312 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 09 10:47:26 crc kubenswrapper[4740]: I1009 10:47:26.952728 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 10:47:26 crc kubenswrapper[4740]: I1009 10:47:26.952788 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 10:47:28 crc kubenswrapper[4740]: I1009 10:47:28.036004 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="76ac811e-3c05-4c9e-b086-d6ff2e8e7ade" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 09 10:47:28 crc kubenswrapper[4740]: I1009 10:47:28.036011 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="76ac811e-3c05-4c9e-b086-d6ff2e8e7ade" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 09 10:47:31 crc kubenswrapper[4740]: I1009 10:47:31.055445 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 09 10:47:31 crc kubenswrapper[4740]: I1009 10:47:31.059556 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 09 10:47:31 crc kubenswrapper[4740]: I1009 10:47:31.066144 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 09 10:47:31 crc kubenswrapper[4740]: I1009 10:47:31.740739 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 09 
10:47:33 crc kubenswrapper[4740]: I1009 10:47:33.737538 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:33 crc kubenswrapper[4740]: I1009 10:47:33.757844 4740 generic.go:334] "Generic (PLEG): container finished" podID="d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d" containerID="4f19228943326be7b4738c62800f8f8943969bec456e826d1d8cc934c64ecea7" exitCode=137 Oct 09 10:47:33 crc kubenswrapper[4740]: I1009 10:47:33.759036 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:33 crc kubenswrapper[4740]: I1009 10:47:33.788374 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d","Type":"ContainerDied","Data":"4f19228943326be7b4738c62800f8f8943969bec456e826d1d8cc934c64ecea7"} Oct 09 10:47:33 crc kubenswrapper[4740]: I1009 10:47:33.788431 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d","Type":"ContainerDied","Data":"6023281e53d3e1d9f46ca49a4af0620bbc2a17469f60898cf65a31292b072544"} Oct 09 10:47:33 crc kubenswrapper[4740]: I1009 10:47:33.788453 4740 scope.go:117] "RemoveContainer" containerID="4f19228943326be7b4738c62800f8f8943969bec456e826d1d8cc934c64ecea7" Oct 09 10:47:33 crc kubenswrapper[4740]: I1009 10:47:33.811897 4740 scope.go:117] "RemoveContainer" containerID="4f19228943326be7b4738c62800f8f8943969bec456e826d1d8cc934c64ecea7" Oct 09 10:47:33 crc kubenswrapper[4740]: E1009 10:47:33.812621 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f19228943326be7b4738c62800f8f8943969bec456e826d1d8cc934c64ecea7\": container with ID starting with 4f19228943326be7b4738c62800f8f8943969bec456e826d1d8cc934c64ecea7 not found: ID does not exist" 
containerID="4f19228943326be7b4738c62800f8f8943969bec456e826d1d8cc934c64ecea7" Oct 09 10:47:33 crc kubenswrapper[4740]: I1009 10:47:33.812703 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f19228943326be7b4738c62800f8f8943969bec456e826d1d8cc934c64ecea7"} err="failed to get container status \"4f19228943326be7b4738c62800f8f8943969bec456e826d1d8cc934c64ecea7\": rpc error: code = NotFound desc = could not find container \"4f19228943326be7b4738c62800f8f8943969bec456e826d1d8cc934c64ecea7\": container with ID starting with 4f19228943326be7b4738c62800f8f8943969bec456e826d1d8cc934c64ecea7 not found: ID does not exist" Oct 09 10:47:33 crc kubenswrapper[4740]: I1009 10:47:33.911570 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8679x\" (UniqueName: \"kubernetes.io/projected/d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d-kube-api-access-8679x\") pod \"d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d\" (UID: \"d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d\") " Oct 09 10:47:33 crc kubenswrapper[4740]: I1009 10:47:33.911694 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d-combined-ca-bundle\") pod \"d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d\" (UID: \"d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d\") " Oct 09 10:47:33 crc kubenswrapper[4740]: I1009 10:47:33.912113 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d-config-data\") pod \"d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d\" (UID: \"d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d\") " Oct 09 10:47:33 crc kubenswrapper[4740]: I1009 10:47:33.920263 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d-kube-api-access-8679x" (OuterVolumeSpecName: 
"kube-api-access-8679x") pod "d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d" (UID: "d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d"). InnerVolumeSpecName "kube-api-access-8679x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:47:33 crc kubenswrapper[4740]: I1009 10:47:33.948352 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d-config-data" (OuterVolumeSpecName: "config-data") pod "d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d" (UID: "d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:33 crc kubenswrapper[4740]: I1009 10:47:33.949892 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d" (UID: "d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.013514 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.013544 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8679x\" (UniqueName: \"kubernetes.io/projected/d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d-kube-api-access-8679x\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.013553 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.091695 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.097311 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.117130 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 10:47:34 crc kubenswrapper[4740]: E1009 10:47:34.117511 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d" containerName="nova-cell1-novncproxy-novncproxy" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.117529 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d" containerName="nova-cell1-novncproxy-novncproxy" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.117774 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d" containerName="nova-cell1-novncproxy-novncproxy" Oct 09 
10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.118356 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.120500 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.146434 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.120880 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.121464 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.217078 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4t9g\" (UniqueName: \"kubernetes.io/projected/5e2d4c31-bba4-46d5-8119-b0970e10437d-kube-api-access-b4t9g\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e2d4c31-bba4-46d5-8119-b0970e10437d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.217164 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e2d4c31-bba4-46d5-8119-b0970e10437d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e2d4c31-bba4-46d5-8119-b0970e10437d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.217199 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e2d4c31-bba4-46d5-8119-b0970e10437d-config-data\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"5e2d4c31-bba4-46d5-8119-b0970e10437d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.217431 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e2d4c31-bba4-46d5-8119-b0970e10437d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e2d4c31-bba4-46d5-8119-b0970e10437d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.217522 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e2d4c31-bba4-46d5-8119-b0970e10437d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e2d4c31-bba4-46d5-8119-b0970e10437d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.319366 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4t9g\" (UniqueName: \"kubernetes.io/projected/5e2d4c31-bba4-46d5-8119-b0970e10437d-kube-api-access-b4t9g\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e2d4c31-bba4-46d5-8119-b0970e10437d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.319438 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e2d4c31-bba4-46d5-8119-b0970e10437d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e2d4c31-bba4-46d5-8119-b0970e10437d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.319466 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e2d4c31-bba4-46d5-8119-b0970e10437d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"5e2d4c31-bba4-46d5-8119-b0970e10437d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.319517 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e2d4c31-bba4-46d5-8119-b0970e10437d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e2d4c31-bba4-46d5-8119-b0970e10437d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.319540 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e2d4c31-bba4-46d5-8119-b0970e10437d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e2d4c31-bba4-46d5-8119-b0970e10437d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.322865 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e2d4c31-bba4-46d5-8119-b0970e10437d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e2d4c31-bba4-46d5-8119-b0970e10437d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.322970 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e2d4c31-bba4-46d5-8119-b0970e10437d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e2d4c31-bba4-46d5-8119-b0970e10437d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.323561 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e2d4c31-bba4-46d5-8119-b0970e10437d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e2d4c31-bba4-46d5-8119-b0970e10437d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:34 crc 
kubenswrapper[4740]: I1009 10:47:34.329638 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e2d4c31-bba4-46d5-8119-b0970e10437d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e2d4c31-bba4-46d5-8119-b0970e10437d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.337618 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4t9g\" (UniqueName: \"kubernetes.io/projected/5e2d4c31-bba4-46d5-8119-b0970e10437d-kube-api-access-b4t9g\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e2d4c31-bba4-46d5-8119-b0970e10437d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.438088 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:34 crc kubenswrapper[4740]: I1009 10:47:34.871176 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 10:47:34 crc kubenswrapper[4740]: W1009 10:47:34.881025 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e2d4c31_bba4_46d5_8119_b0970e10437d.slice/crio-c39a54295094e08c20eb763ca40c93e7c37c70a307f34e74ed102ca73030f3ad WatchSource:0}: Error finding container c39a54295094e08c20eb763ca40c93e7c37c70a307f34e74ed102ca73030f3ad: Status 404 returned error can't find the container with id c39a54295094e08c20eb763ca40c93e7c37c70a307f34e74ed102ca73030f3ad Oct 09 10:47:35 crc kubenswrapper[4740]: I1009 10:47:35.767933 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d" path="/var/lib/kubelet/pods/d09ad294-afe9-4f1a-b1b3-ed313e6e5d0d/volumes" Oct 09 10:47:35 crc kubenswrapper[4740]: I1009 10:47:35.784047 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5e2d4c31-bba4-46d5-8119-b0970e10437d","Type":"ContainerStarted","Data":"a7ccc6b7d742bb7c08152037a867efb9ffaa0fe66e089ff9a5e11e40bf3c2c72"} Oct 09 10:47:35 crc kubenswrapper[4740]: I1009 10:47:35.784102 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5e2d4c31-bba4-46d5-8119-b0970e10437d","Type":"ContainerStarted","Data":"c39a54295094e08c20eb763ca40c93e7c37c70a307f34e74ed102ca73030f3ad"} Oct 09 10:47:35 crc kubenswrapper[4740]: I1009 10:47:35.812175 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.8121545289999998 podStartE2EDuration="1.812154529s" podCreationTimestamp="2025-10-09 10:47:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:47:35.805729604 +0000 UTC m=+1194.767929995" watchObservedRunningTime="2025-10-09 10:47:35.812154529 +0000 UTC m=+1194.774354930" Oct 09 10:47:36 crc kubenswrapper[4740]: I1009 10:47:36.956651 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 09 10:47:36 crc kubenswrapper[4740]: I1009 10:47:36.957183 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 09 10:47:36 crc kubenswrapper[4740]: I1009 10:47:36.958504 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 09 10:47:36 crc kubenswrapper[4740]: I1009 10:47:36.963786 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 09 10:47:37 crc kubenswrapper[4740]: I1009 10:47:37.801890 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 09 10:47:37 crc kubenswrapper[4740]: I1009 10:47:37.804802 4740 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 09 10:47:37 crc kubenswrapper[4740]: I1009 10:47:37.955158 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-6h6zr"] Oct 09 10:47:37 crc kubenswrapper[4740]: I1009 10:47:37.959845 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:47:37 crc kubenswrapper[4740]: I1009 10:47:37.979016 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-6h6zr"] Oct 09 10:47:38 crc kubenswrapper[4740]: I1009 10:47:38.091641 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-6h6zr\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:47:38 crc kubenswrapper[4740]: I1009 10:47:38.091795 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-6h6zr\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:47:38 crc kubenswrapper[4740]: I1009 10:47:38.091845 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-config\") pod \"dnsmasq-dns-59cf4bdb65-6h6zr\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:47:38 crc kubenswrapper[4740]: I1009 10:47:38.091891 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-6h6zr\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:47:38 crc kubenswrapper[4740]: I1009 10:47:38.091916 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmbvq\" (UniqueName: \"kubernetes.io/projected/bdee6391-3978-4f05-b3c6-a80276b6295f-kube-api-access-cmbvq\") pod \"dnsmasq-dns-59cf4bdb65-6h6zr\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:47:38 crc kubenswrapper[4740]: I1009 10:47:38.091981 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-6h6zr\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:47:38 crc kubenswrapper[4740]: I1009 10:47:38.194282 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-config\") pod \"dnsmasq-dns-59cf4bdb65-6h6zr\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:47:38 crc kubenswrapper[4740]: I1009 10:47:38.194363 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-6h6zr\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:47:38 crc kubenswrapper[4740]: I1009 10:47:38.194401 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmbvq\" (UniqueName: 
\"kubernetes.io/projected/bdee6391-3978-4f05-b3c6-a80276b6295f-kube-api-access-cmbvq\") pod \"dnsmasq-dns-59cf4bdb65-6h6zr\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:47:38 crc kubenswrapper[4740]: I1009 10:47:38.194444 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-6h6zr\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:47:38 crc kubenswrapper[4740]: I1009 10:47:38.194517 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-6h6zr\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:47:38 crc kubenswrapper[4740]: I1009 10:47:38.194596 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-6h6zr\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:47:38 crc kubenswrapper[4740]: I1009 10:47:38.195217 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-config\") pod \"dnsmasq-dns-59cf4bdb65-6h6zr\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:47:38 crc kubenswrapper[4740]: I1009 10:47:38.195482 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-6h6zr\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:47:38 crc kubenswrapper[4740]: I1009 10:47:38.196158 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-6h6zr\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:47:38 crc kubenswrapper[4740]: I1009 10:47:38.196195 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-6h6zr\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:47:38 crc kubenswrapper[4740]: I1009 10:47:38.196859 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-6h6zr\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:47:38 crc kubenswrapper[4740]: I1009 10:47:38.221554 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmbvq\" (UniqueName: \"kubernetes.io/projected/bdee6391-3978-4f05-b3c6-a80276b6295f-kube-api-access-cmbvq\") pod \"dnsmasq-dns-59cf4bdb65-6h6zr\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:47:38 crc kubenswrapper[4740]: I1009 10:47:38.296404 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:47:38 crc kubenswrapper[4740]: I1009 10:47:38.801958 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-6h6zr"] Oct 09 10:47:38 crc kubenswrapper[4740]: W1009 10:47:38.809221 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdee6391_3978_4f05_b3c6_a80276b6295f.slice/crio-014496efd3dba5e8e08537154d2b236b8419532dd5e728c517c1b695b0572e3e WatchSource:0}: Error finding container 014496efd3dba5e8e08537154d2b236b8419532dd5e728c517c1b695b0572e3e: Status 404 returned error can't find the container with id 014496efd3dba5e8e08537154d2b236b8419532dd5e728c517c1b695b0572e3e Oct 09 10:47:39 crc kubenswrapper[4740]: I1009 10:47:39.439160 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:39 crc kubenswrapper[4740]: I1009 10:47:39.818479 4740 generic.go:334] "Generic (PLEG): container finished" podID="bdee6391-3978-4f05-b3c6-a80276b6295f" containerID="fc0fac6a7dd1576484c374447515d2b754afe9e28ec468a90f186817d13235a4" exitCode=0 Oct 09 10:47:39 crc kubenswrapper[4740]: I1009 10:47:39.819980 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" event={"ID":"bdee6391-3978-4f05-b3c6-a80276b6295f","Type":"ContainerDied","Data":"fc0fac6a7dd1576484c374447515d2b754afe9e28ec468a90f186817d13235a4"} Oct 09 10:47:39 crc kubenswrapper[4740]: I1009 10:47:39.820013 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" event={"ID":"bdee6391-3978-4f05-b3c6-a80276b6295f","Type":"ContainerStarted","Data":"014496efd3dba5e8e08537154d2b236b8419532dd5e728c517c1b695b0572e3e"} Oct 09 10:47:39 crc kubenswrapper[4740]: I1009 10:47:39.823838 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:47:39 crc 
kubenswrapper[4740]: I1009 10:47:39.824115 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="09533833-0903-487b-9963-e36425a64e8a" containerName="ceilometer-central-agent" containerID="cri-o://28ea3223780372f8400a9eca5e232e0cdf1186c9376ced21a3e8f51f07ace5a2" gracePeriod=30 Oct 09 10:47:39 crc kubenswrapper[4740]: I1009 10:47:39.824343 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="09533833-0903-487b-9963-e36425a64e8a" containerName="proxy-httpd" containerID="cri-o://804864f74dac8cc7249d36f9704d7f69737734e9d809893a89e7682cfa10f873" gracePeriod=30 Oct 09 10:47:39 crc kubenswrapper[4740]: I1009 10:47:39.824487 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="09533833-0903-487b-9963-e36425a64e8a" containerName="sg-core" containerID="cri-o://9733b06e5448b82d0c0fafcee3ad5e381b32cc183d79cc18b4c79e4839c33bda" gracePeriod=30 Oct 09 10:47:39 crc kubenswrapper[4740]: I1009 10:47:39.825373 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="09533833-0903-487b-9963-e36425a64e8a" containerName="ceilometer-notification-agent" containerID="cri-o://0643a9bc8d8c0f8173997ec0ae715741b92e076e17421a4c282ce59d69996b12" gracePeriod=30 Oct 09 10:47:39 crc kubenswrapper[4740]: I1009 10:47:39.854786 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="09533833-0903-487b-9963-e36425a64e8a" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.198:3000/\": EOF" Oct 09 10:47:40 crc kubenswrapper[4740]: I1009 10:47:40.276217 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 10:47:40 crc kubenswrapper[4740]: I1009 10:47:40.829415 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" 
event={"ID":"bdee6391-3978-4f05-b3c6-a80276b6295f","Type":"ContainerStarted","Data":"6b41f3337bebd89d9b228c81cb1dfb6df151ad00f2a6e62c895aa024d1415b04"} Oct 09 10:47:40 crc kubenswrapper[4740]: I1009 10:47:40.829925 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:47:40 crc kubenswrapper[4740]: I1009 10:47:40.832064 4740 generic.go:334] "Generic (PLEG): container finished" podID="09533833-0903-487b-9963-e36425a64e8a" containerID="804864f74dac8cc7249d36f9704d7f69737734e9d809893a89e7682cfa10f873" exitCode=0 Oct 09 10:47:40 crc kubenswrapper[4740]: I1009 10:47:40.832094 4740 generic.go:334] "Generic (PLEG): container finished" podID="09533833-0903-487b-9963-e36425a64e8a" containerID="9733b06e5448b82d0c0fafcee3ad5e381b32cc183d79cc18b4c79e4839c33bda" exitCode=2 Oct 09 10:47:40 crc kubenswrapper[4740]: I1009 10:47:40.832102 4740 generic.go:334] "Generic (PLEG): container finished" podID="09533833-0903-487b-9963-e36425a64e8a" containerID="28ea3223780372f8400a9eca5e232e0cdf1186c9376ced21a3e8f51f07ace5a2" exitCode=0 Oct 09 10:47:40 crc kubenswrapper[4740]: I1009 10:47:40.832271 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="76ac811e-3c05-4c9e-b086-d6ff2e8e7ade" containerName="nova-api-log" containerID="cri-o://1c8c1b5f805e09cb142090a02a1057ddcd4537ca022aaed6e2be0362eecb70b8" gracePeriod=30 Oct 09 10:47:40 crc kubenswrapper[4740]: I1009 10:47:40.832505 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"09533833-0903-487b-9963-e36425a64e8a","Type":"ContainerDied","Data":"804864f74dac8cc7249d36f9704d7f69737734e9d809893a89e7682cfa10f873"} Oct 09 10:47:40 crc kubenswrapper[4740]: I1009 10:47:40.832537 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"09533833-0903-487b-9963-e36425a64e8a","Type":"ContainerDied","Data":"9733b06e5448b82d0c0fafcee3ad5e381b32cc183d79cc18b4c79e4839c33bda"} Oct 09 10:47:40 crc kubenswrapper[4740]: I1009 10:47:40.832547 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"09533833-0903-487b-9963-e36425a64e8a","Type":"ContainerDied","Data":"28ea3223780372f8400a9eca5e232e0cdf1186c9376ced21a3e8f51f07ace5a2"} Oct 09 10:47:40 crc kubenswrapper[4740]: I1009 10:47:40.832593 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="76ac811e-3c05-4c9e-b086-d6ff2e8e7ade" containerName="nova-api-api" containerID="cri-o://aa6ffea9c45aa695c43c57444146d22723ef90683ce847bf300ba6981445fe8d" gracePeriod=30 Oct 09 10:47:40 crc kubenswrapper[4740]: I1009 10:47:40.861253 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" podStartSLOduration=3.861230604 podStartE2EDuration="3.861230604s" podCreationTimestamp="2025-10-09 10:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:47:40.857469682 +0000 UTC m=+1199.819670103" watchObservedRunningTime="2025-10-09 10:47:40.861230604 +0000 UTC m=+1199.823431025" Oct 09 10:47:41 crc kubenswrapper[4740]: I1009 10:47:41.845058 4740 generic.go:334] "Generic (PLEG): container finished" podID="76ac811e-3c05-4c9e-b086-d6ff2e8e7ade" containerID="1c8c1b5f805e09cb142090a02a1057ddcd4537ca022aaed6e2be0362eecb70b8" exitCode=143 Oct 09 10:47:41 crc kubenswrapper[4740]: I1009 10:47:41.845194 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade","Type":"ContainerDied","Data":"1c8c1b5f805e09cb142090a02a1057ddcd4537ca022aaed6e2be0362eecb70b8"} Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.751186 4740 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.866728 4740 generic.go:334] "Generic (PLEG): container finished" podID="09533833-0903-487b-9963-e36425a64e8a" containerID="0643a9bc8d8c0f8173997ec0ae715741b92e076e17421a4c282ce59d69996b12" exitCode=0 Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.866809 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.866799 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"09533833-0903-487b-9963-e36425a64e8a","Type":"ContainerDied","Data":"0643a9bc8d8c0f8173997ec0ae715741b92e076e17421a4c282ce59d69996b12"} Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.866879 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"09533833-0903-487b-9963-e36425a64e8a","Type":"ContainerDied","Data":"1b5427dcbfa95a34de33e513a751a3441a412c3889cb1c64c874f9cf4c3f45b2"} Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.866905 4740 scope.go:117] "RemoveContainer" containerID="804864f74dac8cc7249d36f9704d7f69737734e9d809893a89e7682cfa10f873" Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.888844 4740 scope.go:117] "RemoveContainer" containerID="9733b06e5448b82d0c0fafcee3ad5e381b32cc183d79cc18b4c79e4839c33bda" Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.900479 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09533833-0903-487b-9963-e36425a64e8a-run-httpd\") pod \"09533833-0903-487b-9963-e36425a64e8a\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.900521 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/09533833-0903-487b-9963-e36425a64e8a-log-httpd\") pod \"09533833-0903-487b-9963-e36425a64e8a\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.900549 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-config-data\") pod \"09533833-0903-487b-9963-e36425a64e8a\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.900591 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-sg-core-conf-yaml\") pod \"09533833-0903-487b-9963-e36425a64e8a\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.900679 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-scripts\") pod \"09533833-0903-487b-9963-e36425a64e8a\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.900859 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-ceilometer-tls-certs\") pod \"09533833-0903-487b-9963-e36425a64e8a\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.900883 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-combined-ca-bundle\") pod \"09533833-0903-487b-9963-e36425a64e8a\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.900927 4740 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09533833-0903-487b-9963-e36425a64e8a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "09533833-0903-487b-9963-e36425a64e8a" (UID: "09533833-0903-487b-9963-e36425a64e8a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.901298 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r2wn\" (UniqueName: \"kubernetes.io/projected/09533833-0903-487b-9963-e36425a64e8a-kube-api-access-6r2wn\") pod \"09533833-0903-487b-9963-e36425a64e8a\" (UID: \"09533833-0903-487b-9963-e36425a64e8a\") " Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.901687 4740 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09533833-0903-487b-9963-e36425a64e8a-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.902205 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09533833-0903-487b-9963-e36425a64e8a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "09533833-0903-487b-9963-e36425a64e8a" (UID: "09533833-0903-487b-9963-e36425a64e8a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.906369 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09533833-0903-487b-9963-e36425a64e8a-kube-api-access-6r2wn" (OuterVolumeSpecName: "kube-api-access-6r2wn") pod "09533833-0903-487b-9963-e36425a64e8a" (UID: "09533833-0903-487b-9963-e36425a64e8a"). InnerVolumeSpecName "kube-api-access-6r2wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.923509 4740 scope.go:117] "RemoveContainer" containerID="0643a9bc8d8c0f8173997ec0ae715741b92e076e17421a4c282ce59d69996b12" Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.925199 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-scripts" (OuterVolumeSpecName: "scripts") pod "09533833-0903-487b-9963-e36425a64e8a" (UID: "09533833-0903-487b-9963-e36425a64e8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.955981 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "09533833-0903-487b-9963-e36425a64e8a" (UID: "09533833-0903-487b-9963-e36425a64e8a"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.973349 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "09533833-0903-487b-9963-e36425a64e8a" (UID: "09533833-0903-487b-9963-e36425a64e8a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:43 crc kubenswrapper[4740]: I1009 10:47:43.989239 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09533833-0903-487b-9963-e36425a64e8a" (UID: "09533833-0903-487b-9963-e36425a64e8a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.003145 4740 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.003188 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.003198 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r2wn\" (UniqueName: \"kubernetes.io/projected/09533833-0903-487b-9963-e36425a64e8a-kube-api-access-6r2wn\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.003207 4740 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09533833-0903-487b-9963-e36425a64e8a-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.003214 4740 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.003222 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.034955 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-config-data" (OuterVolumeSpecName: "config-data") pod "09533833-0903-487b-9963-e36425a64e8a" (UID: 
"09533833-0903-487b-9963-e36425a64e8a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.105526 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09533833-0903-487b-9963-e36425a64e8a-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.183090 4740 scope.go:117] "RemoveContainer" containerID="28ea3223780372f8400a9eca5e232e0cdf1186c9376ced21a3e8f51f07ace5a2" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.201664 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.211109 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.232778 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:47:44 crc kubenswrapper[4740]: E1009 10:47:44.233130 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09533833-0903-487b-9963-e36425a64e8a" containerName="proxy-httpd" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.233149 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="09533833-0903-487b-9963-e36425a64e8a" containerName="proxy-httpd" Oct 09 10:47:44 crc kubenswrapper[4740]: E1009 10:47:44.233162 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09533833-0903-487b-9963-e36425a64e8a" containerName="ceilometer-notification-agent" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.233169 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="09533833-0903-487b-9963-e36425a64e8a" containerName="ceilometer-notification-agent" Oct 09 10:47:44 crc kubenswrapper[4740]: E1009 10:47:44.233194 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09533833-0903-487b-9963-e36425a64e8a" 
containerName="ceilometer-central-agent" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.233199 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="09533833-0903-487b-9963-e36425a64e8a" containerName="ceilometer-central-agent" Oct 09 10:47:44 crc kubenswrapper[4740]: E1009 10:47:44.233222 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09533833-0903-487b-9963-e36425a64e8a" containerName="sg-core" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.233227 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="09533833-0903-487b-9963-e36425a64e8a" containerName="sg-core" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.233408 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="09533833-0903-487b-9963-e36425a64e8a" containerName="sg-core" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.233433 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="09533833-0903-487b-9963-e36425a64e8a" containerName="proxy-httpd" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.233442 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="09533833-0903-487b-9963-e36425a64e8a" containerName="ceilometer-central-agent" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.233456 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="09533833-0903-487b-9963-e36425a64e8a" containerName="ceilometer-notification-agent" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.234992 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.237791 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.238074 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.238202 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.243963 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.249243 4740 scope.go:117] "RemoveContainer" containerID="804864f74dac8cc7249d36f9704d7f69737734e9d809893a89e7682cfa10f873" Oct 09 10:47:44 crc kubenswrapper[4740]: E1009 10:47:44.250458 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"804864f74dac8cc7249d36f9704d7f69737734e9d809893a89e7682cfa10f873\": container with ID starting with 804864f74dac8cc7249d36f9704d7f69737734e9d809893a89e7682cfa10f873 not found: ID does not exist" containerID="804864f74dac8cc7249d36f9704d7f69737734e9d809893a89e7682cfa10f873" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.250493 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"804864f74dac8cc7249d36f9704d7f69737734e9d809893a89e7682cfa10f873"} err="failed to get container status \"804864f74dac8cc7249d36f9704d7f69737734e9d809893a89e7682cfa10f873\": rpc error: code = NotFound desc = could not find container \"804864f74dac8cc7249d36f9704d7f69737734e9d809893a89e7682cfa10f873\": container with ID starting with 804864f74dac8cc7249d36f9704d7f69737734e9d809893a89e7682cfa10f873 not found: ID does not exist" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 
10:47:44.250519 4740 scope.go:117] "RemoveContainer" containerID="9733b06e5448b82d0c0fafcee3ad5e381b32cc183d79cc18b4c79e4839c33bda" Oct 09 10:47:44 crc kubenswrapper[4740]: E1009 10:47:44.250929 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9733b06e5448b82d0c0fafcee3ad5e381b32cc183d79cc18b4c79e4839c33bda\": container with ID starting with 9733b06e5448b82d0c0fafcee3ad5e381b32cc183d79cc18b4c79e4839c33bda not found: ID does not exist" containerID="9733b06e5448b82d0c0fafcee3ad5e381b32cc183d79cc18b4c79e4839c33bda" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.250952 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9733b06e5448b82d0c0fafcee3ad5e381b32cc183d79cc18b4c79e4839c33bda"} err="failed to get container status \"9733b06e5448b82d0c0fafcee3ad5e381b32cc183d79cc18b4c79e4839c33bda\": rpc error: code = NotFound desc = could not find container \"9733b06e5448b82d0c0fafcee3ad5e381b32cc183d79cc18b4c79e4839c33bda\": container with ID starting with 9733b06e5448b82d0c0fafcee3ad5e381b32cc183d79cc18b4c79e4839c33bda not found: ID does not exist" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.250965 4740 scope.go:117] "RemoveContainer" containerID="0643a9bc8d8c0f8173997ec0ae715741b92e076e17421a4c282ce59d69996b12" Oct 09 10:47:44 crc kubenswrapper[4740]: E1009 10:47:44.254534 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0643a9bc8d8c0f8173997ec0ae715741b92e076e17421a4c282ce59d69996b12\": container with ID starting with 0643a9bc8d8c0f8173997ec0ae715741b92e076e17421a4c282ce59d69996b12 not found: ID does not exist" containerID="0643a9bc8d8c0f8173997ec0ae715741b92e076e17421a4c282ce59d69996b12" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.255883 4740 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0643a9bc8d8c0f8173997ec0ae715741b92e076e17421a4c282ce59d69996b12"} err="failed to get container status \"0643a9bc8d8c0f8173997ec0ae715741b92e076e17421a4c282ce59d69996b12\": rpc error: code = NotFound desc = could not find container \"0643a9bc8d8c0f8173997ec0ae715741b92e076e17421a4c282ce59d69996b12\": container with ID starting with 0643a9bc8d8c0f8173997ec0ae715741b92e076e17421a4c282ce59d69996b12 not found: ID does not exist" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.255953 4740 scope.go:117] "RemoveContainer" containerID="28ea3223780372f8400a9eca5e232e0cdf1186c9376ced21a3e8f51f07ace5a2" Oct 09 10:47:44 crc kubenswrapper[4740]: E1009 10:47:44.257785 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28ea3223780372f8400a9eca5e232e0cdf1186c9376ced21a3e8f51f07ace5a2\": container with ID starting with 28ea3223780372f8400a9eca5e232e0cdf1186c9376ced21a3e8f51f07ace5a2 not found: ID does not exist" containerID="28ea3223780372f8400a9eca5e232e0cdf1186c9376ced21a3e8f51f07ace5a2" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.257811 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28ea3223780372f8400a9eca5e232e0cdf1186c9376ced21a3e8f51f07ace5a2"} err="failed to get container status \"28ea3223780372f8400a9eca5e232e0cdf1186c9376ced21a3e8f51f07ace5a2\": rpc error: code = NotFound desc = could not find container \"28ea3223780372f8400a9eca5e232e0cdf1186c9376ced21a3e8f51f07ace5a2\": container with ID starting with 28ea3223780372f8400a9eca5e232e0cdf1186c9376ced21a3e8f51f07ace5a2 not found: ID does not exist" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.412024 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5eefd278-fab1-4acc-acca-b6474799e6d1-run-httpd\") pod \"ceilometer-0\" (UID: 
\"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.412069 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eefd278-fab1-4acc-acca-b6474799e6d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.412111 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eefd278-fab1-4acc-acca-b6474799e6d1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.412129 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5eefd278-fab1-4acc-acca-b6474799e6d1-scripts\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.412158 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl985\" (UniqueName: \"kubernetes.io/projected/5eefd278-fab1-4acc-acca-b6474799e6d1-kube-api-access-cl985\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.412180 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5eefd278-fab1-4acc-acca-b6474799e6d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc 
kubenswrapper[4740]: I1009 10:47:44.412206 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5eefd278-fab1-4acc-acca-b6474799e6d1-log-httpd\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.412417 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eefd278-fab1-4acc-acca-b6474799e6d1-config-data\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.438648 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.458993 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.462118 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.514016 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5eefd278-fab1-4acc-acca-b6474799e6d1-run-httpd\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.514078 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eefd278-fab1-4acc-acca-b6474799e6d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.514145 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eefd278-fab1-4acc-acca-b6474799e6d1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.514168 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5eefd278-fab1-4acc-acca-b6474799e6d1-scripts\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.514215 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl985\" (UniqueName: \"kubernetes.io/projected/5eefd278-fab1-4acc-acca-b6474799e6d1-kube-api-access-cl985\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " 
pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.514248 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5eefd278-fab1-4acc-acca-b6474799e6d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.514271 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5eefd278-fab1-4acc-acca-b6474799e6d1-log-httpd\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.514300 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eefd278-fab1-4acc-acca-b6474799e6d1-config-data\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.514771 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5eefd278-fab1-4acc-acca-b6474799e6d1-run-httpd\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.515013 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5eefd278-fab1-4acc-acca-b6474799e6d1-log-httpd\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.548407 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5eefd278-fab1-4acc-acca-b6474799e6d1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.548843 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5eefd278-fab1-4acc-acca-b6474799e6d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.548952 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eefd278-fab1-4acc-acca-b6474799e6d1-config-data\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.548972 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5eefd278-fab1-4acc-acca-b6474799e6d1-scripts\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.549363 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eefd278-fab1-4acc-acca-b6474799e6d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.554416 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl985\" (UniqueName: \"kubernetes.io/projected/5eefd278-fab1-4acc-acca-b6474799e6d1-kube-api-access-cl985\") pod \"ceilometer-0\" (UID: \"5eefd278-fab1-4acc-acca-b6474799e6d1\") " pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.565645 4740 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.625792 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhfxt\" (UniqueName: \"kubernetes.io/projected/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-kube-api-access-bhfxt\") pod \"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade\" (UID: \"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade\") " Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.625873 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-logs\") pod \"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade\" (UID: \"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade\") " Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.625935 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-combined-ca-bundle\") pod \"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade\" (UID: \"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade\") " Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.626086 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-config-data\") pod \"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade\" (UID: \"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade\") " Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.632179 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-logs" (OuterVolumeSpecName: "logs") pod "76ac811e-3c05-4c9e-b086-d6ff2e8e7ade" (UID: "76ac811e-3c05-4c9e-b086-d6ff2e8e7ade"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.650951 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-kube-api-access-bhfxt" (OuterVolumeSpecName: "kube-api-access-bhfxt") pod "76ac811e-3c05-4c9e-b086-d6ff2e8e7ade" (UID: "76ac811e-3c05-4c9e-b086-d6ff2e8e7ade"). InnerVolumeSpecName "kube-api-access-bhfxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.677958 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76ac811e-3c05-4c9e-b086-d6ff2e8e7ade" (UID: "76ac811e-3c05-4c9e-b086-d6ff2e8e7ade"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.703422 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-config-data" (OuterVolumeSpecName: "config-data") pod "76ac811e-3c05-4c9e-b086-d6ff2e8e7ade" (UID: "76ac811e-3c05-4c9e-b086-d6ff2e8e7ade"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.733894 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.733927 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhfxt\" (UniqueName: \"kubernetes.io/projected/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-kube-api-access-bhfxt\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.733939 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-logs\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.733948 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.877151 4740 generic.go:334] "Generic (PLEG): container finished" podID="76ac811e-3c05-4c9e-b086-d6ff2e8e7ade" containerID="aa6ffea9c45aa695c43c57444146d22723ef90683ce847bf300ba6981445fe8d" exitCode=0 Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.877229 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.877241 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade","Type":"ContainerDied","Data":"aa6ffea9c45aa695c43c57444146d22723ef90683ce847bf300ba6981445fe8d"} Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.877289 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"76ac811e-3c05-4c9e-b086-d6ff2e8e7ade","Type":"ContainerDied","Data":"913009c408883c40606aec6a04f9d4ffdc45a021a6c6298139dd90d6fcbd5db1"} Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.877307 4740 scope.go:117] "RemoveContainer" containerID="aa6ffea9c45aa695c43c57444146d22723ef90683ce847bf300ba6981445fe8d" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.900424 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.912011 4740 scope.go:117] "RemoveContainer" containerID="1c8c1b5f805e09cb142090a02a1057ddcd4537ca022aaed6e2be0362eecb70b8" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.922011 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.936860 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.955800 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 09 10:47:44 crc kubenswrapper[4740]: E1009 10:47:44.956216 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76ac811e-3c05-4c9e-b086-d6ff2e8e7ade" containerName="nova-api-log" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.956228 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="76ac811e-3c05-4c9e-b086-d6ff2e8e7ade" 
containerName="nova-api-log" Oct 09 10:47:44 crc kubenswrapper[4740]: E1009 10:47:44.956248 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76ac811e-3c05-4c9e-b086-d6ff2e8e7ade" containerName="nova-api-api" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.956254 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="76ac811e-3c05-4c9e-b086-d6ff2e8e7ade" containerName="nova-api-api" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.956459 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="76ac811e-3c05-4c9e-b086-d6ff2e8e7ade" containerName="nova-api-api" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.956473 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="76ac811e-3c05-4c9e-b086-d6ff2e8e7ade" containerName="nova-api-log" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.957464 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.961312 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.961535 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.962362 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.994873 4740 scope.go:117] "RemoveContainer" containerID="aa6ffea9c45aa695c43c57444146d22723ef90683ce847bf300ba6981445fe8d" Oct 09 10:47:44 crc kubenswrapper[4740]: E1009 10:47:44.999653 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa6ffea9c45aa695c43c57444146d22723ef90683ce847bf300ba6981445fe8d\": container with ID starting with 
aa6ffea9c45aa695c43c57444146d22723ef90683ce847bf300ba6981445fe8d not found: ID does not exist" containerID="aa6ffea9c45aa695c43c57444146d22723ef90683ce847bf300ba6981445fe8d" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.999689 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa6ffea9c45aa695c43c57444146d22723ef90683ce847bf300ba6981445fe8d"} err="failed to get container status \"aa6ffea9c45aa695c43c57444146d22723ef90683ce847bf300ba6981445fe8d\": rpc error: code = NotFound desc = could not find container \"aa6ffea9c45aa695c43c57444146d22723ef90683ce847bf300ba6981445fe8d\": container with ID starting with aa6ffea9c45aa695c43c57444146d22723ef90683ce847bf300ba6981445fe8d not found: ID does not exist" Oct 09 10:47:44 crc kubenswrapper[4740]: I1009 10:47:44.999710 4740 scope.go:117] "RemoveContainer" containerID="1c8c1b5f805e09cb142090a02a1057ddcd4537ca022aaed6e2be0362eecb70b8" Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.000157 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 10:47:45 crc kubenswrapper[4740]: E1009 10:47:45.001504 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c8c1b5f805e09cb142090a02a1057ddcd4537ca022aaed6e2be0362eecb70b8\": container with ID starting with 1c8c1b5f805e09cb142090a02a1057ddcd4537ca022aaed6e2be0362eecb70b8 not found: ID does not exist" containerID="1c8c1b5f805e09cb142090a02a1057ddcd4537ca022aaed6e2be0362eecb70b8" Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.001542 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c8c1b5f805e09cb142090a02a1057ddcd4537ca022aaed6e2be0362eecb70b8"} err="failed to get container status \"1c8c1b5f805e09cb142090a02a1057ddcd4537ca022aaed6e2be0362eecb70b8\": rpc error: code = NotFound desc = could not find container 
\"1c8c1b5f805e09cb142090a02a1057ddcd4537ca022aaed6e2be0362eecb70b8\": container with ID starting with 1c8c1b5f805e09cb142090a02a1057ddcd4537ca022aaed6e2be0362eecb70b8 not found: ID does not exist" Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.042088 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") " pod="openstack/nova-api-0" Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.042146 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-logs\") pod \"nova-api-0\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") " pod="openstack/nova-api-0" Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.042191 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rsq4\" (UniqueName: \"kubernetes.io/projected/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-kube-api-access-2rsq4\") pod \"nova-api-0\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") " pod="openstack/nova-api-0" Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.043166 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-config-data\") pod \"nova-api-0\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") " pod="openstack/nova-api-0" Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.043217 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") " pod="openstack/nova-api-0"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.043254 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-public-tls-certs\") pod \"nova-api-0\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") " pod="openstack/nova-api-0"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.115226 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 09 10:47:45 crc kubenswrapper[4740]: W1009 10:47:45.120634 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5eefd278_fab1_4acc_acca_b6474799e6d1.slice/crio-327c855d156af159ae5ec8bcfb99141f88473b22d271a019c793bc2ae43e922b WatchSource:0}: Error finding container 327c855d156af159ae5ec8bcfb99141f88473b22d271a019c793bc2ae43e922b: Status 404 returned error can't find the container with id 327c855d156af159ae5ec8bcfb99141f88473b22d271a019c793bc2ae43e922b
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.143767 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-xvz9n"]
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.144984 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-config-data\") pod \"nova-api-0\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") " pod="openstack/nova-api-0"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.145059 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") " pod="openstack/nova-api-0"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.145096 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-public-tls-certs\") pod \"nova-api-0\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") " pod="openstack/nova-api-0"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.145936 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") " pod="openstack/nova-api-0"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.145971 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-logs\") pod \"nova-api-0\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") " pod="openstack/nova-api-0"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.146003 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xvz9n"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.146028 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rsq4\" (UniqueName: \"kubernetes.io/projected/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-kube-api-access-2rsq4\") pod \"nova-api-0\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") " pod="openstack/nova-api-0"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.146425 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-logs\") pod \"nova-api-0\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") " pod="openstack/nova-api-0"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.150338 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") " pod="openstack/nova-api-0"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.150510 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-public-tls-certs\") pod \"nova-api-0\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") " pod="openstack/nova-api-0"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.150711 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-config-data\") pod \"nova-api-0\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") " pod="openstack/nova-api-0"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.151012 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") " pod="openstack/nova-api-0"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.156537 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.158597 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.161249 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xvz9n"]
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.168936 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rsq4\" (UniqueName: \"kubernetes.io/projected/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-kube-api-access-2rsq4\") pod \"nova-api-0\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") " pod="openstack/nova-api-0"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.247464 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b098d98-b0c4-46f4-b79b-57a6405f0385-config-data\") pod \"nova-cell1-cell-mapping-xvz9n\" (UID: \"5b098d98-b0c4-46f4-b79b-57a6405f0385\") " pod="openstack/nova-cell1-cell-mapping-xvz9n"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.247515 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b098d98-b0c4-46f4-b79b-57a6405f0385-scripts\") pod \"nova-cell1-cell-mapping-xvz9n\" (UID: \"5b098d98-b0c4-46f4-b79b-57a6405f0385\") " pod="openstack/nova-cell1-cell-mapping-xvz9n"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.247585 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b098d98-b0c4-46f4-b79b-57a6405f0385-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xvz9n\" (UID: \"5b098d98-b0c4-46f4-b79b-57a6405f0385\") " pod="openstack/nova-cell1-cell-mapping-xvz9n"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.247605 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srdd4\" (UniqueName: \"kubernetes.io/projected/5b098d98-b0c4-46f4-b79b-57a6405f0385-kube-api-access-srdd4\") pod \"nova-cell1-cell-mapping-xvz9n\" (UID: \"5b098d98-b0c4-46f4-b79b-57a6405f0385\") " pod="openstack/nova-cell1-cell-mapping-xvz9n"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.310374 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.349727 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b098d98-b0c4-46f4-b79b-57a6405f0385-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xvz9n\" (UID: \"5b098d98-b0c4-46f4-b79b-57a6405f0385\") " pod="openstack/nova-cell1-cell-mapping-xvz9n"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.349788 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srdd4\" (UniqueName: \"kubernetes.io/projected/5b098d98-b0c4-46f4-b79b-57a6405f0385-kube-api-access-srdd4\") pod \"nova-cell1-cell-mapping-xvz9n\" (UID: \"5b098d98-b0c4-46f4-b79b-57a6405f0385\") " pod="openstack/nova-cell1-cell-mapping-xvz9n"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.349907 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b098d98-b0c4-46f4-b79b-57a6405f0385-config-data\") pod \"nova-cell1-cell-mapping-xvz9n\" (UID: \"5b098d98-b0c4-46f4-b79b-57a6405f0385\") " pod="openstack/nova-cell1-cell-mapping-xvz9n"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.349947 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b098d98-b0c4-46f4-b79b-57a6405f0385-scripts\") pod \"nova-cell1-cell-mapping-xvz9n\" (UID: \"5b098d98-b0c4-46f4-b79b-57a6405f0385\") " pod="openstack/nova-cell1-cell-mapping-xvz9n"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.353386 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b098d98-b0c4-46f4-b79b-57a6405f0385-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xvz9n\" (UID: \"5b098d98-b0c4-46f4-b79b-57a6405f0385\") " pod="openstack/nova-cell1-cell-mapping-xvz9n"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.355212 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b098d98-b0c4-46f4-b79b-57a6405f0385-scripts\") pod \"nova-cell1-cell-mapping-xvz9n\" (UID: \"5b098d98-b0c4-46f4-b79b-57a6405f0385\") " pod="openstack/nova-cell1-cell-mapping-xvz9n"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.366734 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b098d98-b0c4-46f4-b79b-57a6405f0385-config-data\") pod \"nova-cell1-cell-mapping-xvz9n\" (UID: \"5b098d98-b0c4-46f4-b79b-57a6405f0385\") " pod="openstack/nova-cell1-cell-mapping-xvz9n"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.370514 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srdd4\" (UniqueName: \"kubernetes.io/projected/5b098d98-b0c4-46f4-b79b-57a6405f0385-kube-api-access-srdd4\") pod \"nova-cell1-cell-mapping-xvz9n\" (UID: \"5b098d98-b0c4-46f4-b79b-57a6405f0385\") " pod="openstack/nova-cell1-cell-mapping-xvz9n"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.557991 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xvz9n"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.772051 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09533833-0903-487b-9963-e36425a64e8a" path="/var/lib/kubelet/pods/09533833-0903-487b-9963-e36425a64e8a/volumes"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.772912 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76ac811e-3c05-4c9e-b086-d6ff2e8e7ade" path="/var/lib/kubelet/pods/76ac811e-3c05-4c9e-b086-d6ff2e8e7ade/volumes"
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.784848 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.894968 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5fc8c636-1360-4580-b0e4-00d8b4d13e8b","Type":"ContainerStarted","Data":"c88a80ef118cc2bda4e31b0264086d85d3437e57bfe2a150c14cff788f679c95"}
Oct 09 10:47:45 crc kubenswrapper[4740]: I1009 10:47:45.902380 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5eefd278-fab1-4acc-acca-b6474799e6d1","Type":"ContainerStarted","Data":"327c855d156af159ae5ec8bcfb99141f88473b22d271a019c793bc2ae43e922b"}
Oct 09 10:47:46 crc kubenswrapper[4740]: I1009 10:47:46.045206 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xvz9n"]
Oct 09 10:47:46 crc kubenswrapper[4740]: I1009 10:47:46.912842 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5fc8c636-1360-4580-b0e4-00d8b4d13e8b","Type":"ContainerStarted","Data":"f80a124101758a10a5749cfc88906d83d87fd3c9b4d6a1dad7b1cac08ee84bb6"}
Oct 09 10:47:46 crc kubenswrapper[4740]: I1009 10:47:46.913238 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5fc8c636-1360-4580-b0e4-00d8b4d13e8b","Type":"ContainerStarted","Data":"e24bd2aed454c78b4c5d1cc3149ebfc14598e23ba6c925e866641c978ad2ec7b"}
Oct 09 10:47:46 crc kubenswrapper[4740]: I1009 10:47:46.915407 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xvz9n" event={"ID":"5b098d98-b0c4-46f4-b79b-57a6405f0385","Type":"ContainerStarted","Data":"3c6b520bce85df793bad710b51aded2089c1f86036f6e559def420a2218311ce"}
Oct 09 10:47:46 crc kubenswrapper[4740]: I1009 10:47:46.915441 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xvz9n" event={"ID":"5b098d98-b0c4-46f4-b79b-57a6405f0385","Type":"ContainerStarted","Data":"f78440fe4c95f2e339bd093ded517910f433f462102a2acabf2f8f1715b88abc"}
Oct 09 10:47:46 crc kubenswrapper[4740]: I1009 10:47:46.932404 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5eefd278-fab1-4acc-acca-b6474799e6d1","Type":"ContainerStarted","Data":"81f33866eefa9549dbb49a17c6b225680dc59f5a607e7f7a386507e955431496"}
Oct 09 10:47:46 crc kubenswrapper[4740]: I1009 10:47:46.943397 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.943372563 podStartE2EDuration="2.943372563s" podCreationTimestamp="2025-10-09 10:47:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:47:46.931058506 +0000 UTC m=+1205.893258887" watchObservedRunningTime="2025-10-09 10:47:46.943372563 +0000 UTC m=+1205.905572944"
Oct 09 10:47:46 crc kubenswrapper[4740]: I1009 10:47:46.959555 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-xvz9n" podStartSLOduration=1.959530795 podStartE2EDuration="1.959530795s" podCreationTimestamp="2025-10-09 10:47:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:47:46.95385279 +0000 UTC m=+1205.916053211" watchObservedRunningTime="2025-10-09 10:47:46.959530795 +0000 UTC m=+1205.921731176"
Oct 09 10:47:47 crc kubenswrapper[4740]: I1009 10:47:47.950826 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5eefd278-fab1-4acc-acca-b6474799e6d1","Type":"ContainerStarted","Data":"1e0d9f9c7995d2d38b570f1120b4dde21545ea8ceaebb2d0f05d351609ce7ac4"}
Oct 09 10:47:47 crc kubenswrapper[4740]: I1009 10:47:47.951149 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5eefd278-fab1-4acc-acca-b6474799e6d1","Type":"ContainerStarted","Data":"3ef5ae7ebea91aa972c97a144dbeb5be62c93e3c16fa2d980c7922b3321cb4bb"}
Oct 09 10:47:48 crc kubenswrapper[4740]: I1009 10:47:48.299066 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr"
Oct 09 10:47:48 crc kubenswrapper[4740]: I1009 10:47:48.357956 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-z977l"]
Oct 09 10:47:48 crc kubenswrapper[4740]: I1009 10:47:48.358205 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-z977l" podUID="f5a8492c-3f09-4613-a24f-3f17de65767d" containerName="dnsmasq-dns" containerID="cri-o://eff4fe15174ef6aec62b353465763dde306566a9255148808771642fdd0c4772" gracePeriod=10
Oct 09 10:47:48 crc kubenswrapper[4740]: I1009 10:47:48.961239 4740 generic.go:334] "Generic (PLEG): container finished" podID="f5a8492c-3f09-4613-a24f-3f17de65767d" containerID="eff4fe15174ef6aec62b353465763dde306566a9255148808771642fdd0c4772" exitCode=0
Oct 09 10:47:48 crc kubenswrapper[4740]: I1009 10:47:48.961271 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-z977l" event={"ID":"f5a8492c-3f09-4613-a24f-3f17de65767d","Type":"ContainerDied","Data":"eff4fe15174ef6aec62b353465763dde306566a9255148808771642fdd0c4772"}
Oct 09 10:47:48 crc kubenswrapper[4740]: I1009 10:47:48.961631 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-z977l" event={"ID":"f5a8492c-3f09-4613-a24f-3f17de65767d","Type":"ContainerDied","Data":"66c97661f3d2aedcc9364e7ecd58a78c5e621896ceb1fc6fe140124dace37f37"}
Oct 09 10:47:48 crc kubenswrapper[4740]: I1009 10:47:48.961652 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66c97661f3d2aedcc9364e7ecd58a78c5e621896ceb1fc6fe140124dace37f37"
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.022735 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-z977l"
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.130508 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-dns-swift-storage-0\") pod \"f5a8492c-3f09-4613-a24f-3f17de65767d\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") "
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.130591 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-ovsdbserver-nb\") pod \"f5a8492c-3f09-4613-a24f-3f17de65767d\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") "
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.130646 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-dns-svc\") pod \"f5a8492c-3f09-4613-a24f-3f17de65767d\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") "
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.130720 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-ovsdbserver-sb\") pod \"f5a8492c-3f09-4613-a24f-3f17de65767d\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") "
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.130830 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-config\") pod \"f5a8492c-3f09-4613-a24f-3f17de65767d\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") "
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.130850 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pb56\" (UniqueName: \"kubernetes.io/projected/f5a8492c-3f09-4613-a24f-3f17de65767d-kube-api-access-9pb56\") pod \"f5a8492c-3f09-4613-a24f-3f17de65767d\" (UID: \"f5a8492c-3f09-4613-a24f-3f17de65767d\") "
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.140116 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5a8492c-3f09-4613-a24f-3f17de65767d-kube-api-access-9pb56" (OuterVolumeSpecName: "kube-api-access-9pb56") pod "f5a8492c-3f09-4613-a24f-3f17de65767d" (UID: "f5a8492c-3f09-4613-a24f-3f17de65767d"). InnerVolumeSpecName "kube-api-access-9pb56". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.185800 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f5a8492c-3f09-4613-a24f-3f17de65767d" (UID: "f5a8492c-3f09-4613-a24f-3f17de65767d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.187236 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f5a8492c-3f09-4613-a24f-3f17de65767d" (UID: "f5a8492c-3f09-4613-a24f-3f17de65767d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.187537 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f5a8492c-3f09-4613-a24f-3f17de65767d" (UID: "f5a8492c-3f09-4613-a24f-3f17de65767d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.193934 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f5a8492c-3f09-4613-a24f-3f17de65767d" (UID: "f5a8492c-3f09-4613-a24f-3f17de65767d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.199620 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-config" (OuterVolumeSpecName: "config") pod "f5a8492c-3f09-4613-a24f-3f17de65767d" (UID: "f5a8492c-3f09-4613-a24f-3f17de65767d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.232709 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-config\") on node \"crc\" DevicePath \"\""
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.233004 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pb56\" (UniqueName: \"kubernetes.io/projected/f5a8492c-3f09-4613-a24f-3f17de65767d-kube-api-access-9pb56\") on node \"crc\" DevicePath \"\""
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.233077 4740 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.233139 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.233200 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.233253 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5a8492c-3f09-4613-a24f-3f17de65767d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.975030 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-z977l"
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.976462 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5eefd278-fab1-4acc-acca-b6474799e6d1","Type":"ContainerStarted","Data":"be63cda8df9273d4d871ef82086fe02041f437e034fd004509b20ccc6af72330"}
Oct 09 10:47:49 crc kubenswrapper[4740]: I1009 10:47:49.976490 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 09 10:47:50 crc kubenswrapper[4740]: I1009 10:47:50.015683 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.303471734 podStartE2EDuration="6.015665337s" podCreationTimestamp="2025-10-09 10:47:44 +0000 UTC" firstStartedPulling="2025-10-09 10:47:45.123356839 +0000 UTC m=+1204.085557220" lastFinishedPulling="2025-10-09 10:47:48.835550442 +0000 UTC m=+1207.797750823" observedRunningTime="2025-10-09 10:47:50.008236474 +0000 UTC m=+1208.970436855" watchObservedRunningTime="2025-10-09 10:47:50.015665337 +0000 UTC m=+1208.977865718"
Oct 09 10:47:50 crc kubenswrapper[4740]: I1009 10:47:50.027746 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-z977l"]
Oct 09 10:47:50 crc kubenswrapper[4740]: I1009 10:47:50.035955 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-z977l"]
Oct 09 10:47:50 crc kubenswrapper[4740]: I1009 10:47:50.990301 4740 generic.go:334] "Generic (PLEG): container finished" podID="5b098d98-b0c4-46f4-b79b-57a6405f0385" containerID="3c6b520bce85df793bad710b51aded2089c1f86036f6e559def420a2218311ce" exitCode=0
Oct 09 10:47:50 crc kubenswrapper[4740]: I1009 10:47:50.990383 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xvz9n" event={"ID":"5b098d98-b0c4-46f4-b79b-57a6405f0385","Type":"ContainerDied","Data":"3c6b520bce85df793bad710b51aded2089c1f86036f6e559def420a2218311ce"}
Oct 09 10:47:51 crc kubenswrapper[4740]: I1009 10:47:51.767795 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5a8492c-3f09-4613-a24f-3f17de65767d" path="/var/lib/kubelet/pods/f5a8492c-3f09-4613-a24f-3f17de65767d/volumes"
Oct 09 10:47:52 crc kubenswrapper[4740]: I1009 10:47:52.365827 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xvz9n"
Oct 09 10:47:52 crc kubenswrapper[4740]: I1009 10:47:52.491970 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b098d98-b0c4-46f4-b79b-57a6405f0385-config-data\") pod \"5b098d98-b0c4-46f4-b79b-57a6405f0385\" (UID: \"5b098d98-b0c4-46f4-b79b-57a6405f0385\") "
Oct 09 10:47:52 crc kubenswrapper[4740]: I1009 10:47:52.492022 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srdd4\" (UniqueName: \"kubernetes.io/projected/5b098d98-b0c4-46f4-b79b-57a6405f0385-kube-api-access-srdd4\") pod \"5b098d98-b0c4-46f4-b79b-57a6405f0385\" (UID: \"5b098d98-b0c4-46f4-b79b-57a6405f0385\") "
Oct 09 10:47:52 crc kubenswrapper[4740]: I1009 10:47:52.492107 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b098d98-b0c4-46f4-b79b-57a6405f0385-combined-ca-bundle\") pod \"5b098d98-b0c4-46f4-b79b-57a6405f0385\" (UID: \"5b098d98-b0c4-46f4-b79b-57a6405f0385\") "
Oct 09 10:47:52 crc kubenswrapper[4740]: I1009 10:47:52.492206 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b098d98-b0c4-46f4-b79b-57a6405f0385-scripts\") pod \"5b098d98-b0c4-46f4-b79b-57a6405f0385\" (UID: \"5b098d98-b0c4-46f4-b79b-57a6405f0385\") "
Oct 09 10:47:52 crc kubenswrapper[4740]: I1009 10:47:52.500962 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b098d98-b0c4-46f4-b79b-57a6405f0385-scripts" (OuterVolumeSpecName: "scripts") pod "5b098d98-b0c4-46f4-b79b-57a6405f0385" (UID: "5b098d98-b0c4-46f4-b79b-57a6405f0385"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:47:52 crc kubenswrapper[4740]: I1009 10:47:52.501489 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b098d98-b0c4-46f4-b79b-57a6405f0385-kube-api-access-srdd4" (OuterVolumeSpecName: "kube-api-access-srdd4") pod "5b098d98-b0c4-46f4-b79b-57a6405f0385" (UID: "5b098d98-b0c4-46f4-b79b-57a6405f0385"). InnerVolumeSpecName "kube-api-access-srdd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:47:52 crc kubenswrapper[4740]: I1009 10:47:52.535552 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b098d98-b0c4-46f4-b79b-57a6405f0385-config-data" (OuterVolumeSpecName: "config-data") pod "5b098d98-b0c4-46f4-b79b-57a6405f0385" (UID: "5b098d98-b0c4-46f4-b79b-57a6405f0385"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:47:52 crc kubenswrapper[4740]: I1009 10:47:52.536650 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b098d98-b0c4-46f4-b79b-57a6405f0385-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b098d98-b0c4-46f4-b79b-57a6405f0385" (UID: "5b098d98-b0c4-46f4-b79b-57a6405f0385"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:47:52 crc kubenswrapper[4740]: I1009 10:47:52.593947 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b098d98-b0c4-46f4-b79b-57a6405f0385-config-data\") on node \"crc\" DevicePath \"\""
Oct 09 10:47:52 crc kubenswrapper[4740]: I1009 10:47:52.593998 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srdd4\" (UniqueName: \"kubernetes.io/projected/5b098d98-b0c4-46f4-b79b-57a6405f0385-kube-api-access-srdd4\") on node \"crc\" DevicePath \"\""
Oct 09 10:47:52 crc kubenswrapper[4740]: I1009 10:47:52.594009 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b098d98-b0c4-46f4-b79b-57a6405f0385-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 10:47:52 crc kubenswrapper[4740]: I1009 10:47:52.594018 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b098d98-b0c4-46f4-b79b-57a6405f0385-scripts\") on node \"crc\" DevicePath \"\""
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.009909 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xvz9n" event={"ID":"5b098d98-b0c4-46f4-b79b-57a6405f0385","Type":"ContainerDied","Data":"f78440fe4c95f2e339bd093ded517910f433f462102a2acabf2f8f1715b88abc"}
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.009974 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f78440fe4c95f2e339bd093ded517910f433f462102a2acabf2f8f1715b88abc"
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.010056 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xvz9n"
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.184734 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.185180 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5fc8c636-1360-4580-b0e4-00d8b4d13e8b" containerName="nova-api-log" containerID="cri-o://e24bd2aed454c78b4c5d1cc3149ebfc14598e23ba6c925e866641c978ad2ec7b" gracePeriod=30
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.185262 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5fc8c636-1360-4580-b0e4-00d8b4d13e8b" containerName="nova-api-api" containerID="cri-o://f80a124101758a10a5749cfc88906d83d87fd3c9b4d6a1dad7b1cac08ee84bb6" gracePeriod=30
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.194536 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.194844 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ef87ed9c-12c3-4ee4-9011-a826acaab478" containerName="nova-scheduler-scheduler" containerID="cri-o://7a00d6968d0cce600872635acd59d6f9076dc2ff201b4eb9fd26a49cf4211f59" gracePeriod=30
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.296678 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.296955 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c44a55b0-facb-44d5-8343-f2274cc5171d" containerName="nova-metadata-log" containerID="cri-o://052e8ce478536c9612c888f5a22b2015f3fc1d6b3f73396fccf725ed062f5752" gracePeriod=30
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.297473 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c44a55b0-facb-44d5-8343-f2274cc5171d" containerName="nova-metadata-metadata" containerID="cri-o://b6ea6a86794b59fb1d84d184f594a9fdf55672db0a12b37f300f8972e09b7538" gracePeriod=30
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.797147 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-845d6d6f59-z977l" podUID="f5a8492c-3f09-4613-a24f-3f17de65767d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.192:5353: i/o timeout"
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.811666 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.920740 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-logs\") pod \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") "
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.920834 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-public-tls-certs\") pod \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") "
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.920866 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-combined-ca-bundle\") pod \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") "
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.920889 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-internal-tls-certs\") pod \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") "
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.921094 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-config-data\") pod \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") "
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.921264 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rsq4\" (UniqueName: \"kubernetes.io/projected/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-kube-api-access-2rsq4\") pod \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\" (UID: \"5fc8c636-1360-4580-b0e4-00d8b4d13e8b\") "
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.921944 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-logs" (OuterVolumeSpecName: "logs") pod "5fc8c636-1360-4580-b0e4-00d8b4d13e8b" (UID: "5fc8c636-1360-4580-b0e4-00d8b4d13e8b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.926887 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-kube-api-access-2rsq4" (OuterVolumeSpecName: "kube-api-access-2rsq4") pod "5fc8c636-1360-4580-b0e4-00d8b4d13e8b" (UID: "5fc8c636-1360-4580-b0e4-00d8b4d13e8b"). InnerVolumeSpecName "kube-api-access-2rsq4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.952517 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5fc8c636-1360-4580-b0e4-00d8b4d13e8b" (UID: "5fc8c636-1360-4580-b0e4-00d8b4d13e8b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.958820 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-config-data" (OuterVolumeSpecName: "config-data") pod "5fc8c636-1360-4580-b0e4-00d8b4d13e8b" (UID: "5fc8c636-1360-4580-b0e4-00d8b4d13e8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.987174 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5fc8c636-1360-4580-b0e4-00d8b4d13e8b" (UID: "5fc8c636-1360-4580-b0e4-00d8b4d13e8b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:47:53 crc kubenswrapper[4740]: I1009 10:47:53.989598 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5fc8c636-1360-4580-b0e4-00d8b4d13e8b" (UID: "5fc8c636-1360-4580-b0e4-00d8b4d13e8b"). InnerVolumeSpecName "public-tls-certs".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.023941 4740 generic.go:334] "Generic (PLEG): container finished" podID="5fc8c636-1360-4580-b0e4-00d8b4d13e8b" containerID="f80a124101758a10a5749cfc88906d83d87fd3c9b4d6a1dad7b1cac08ee84bb6" exitCode=0 Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.023971 4740 generic.go:334] "Generic (PLEG): container finished" podID="5fc8c636-1360-4580-b0e4-00d8b4d13e8b" containerID="e24bd2aed454c78b4c5d1cc3149ebfc14598e23ba6c925e866641c978ad2ec7b" exitCode=143 Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.024007 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5fc8c636-1360-4580-b0e4-00d8b4d13e8b","Type":"ContainerDied","Data":"f80a124101758a10a5749cfc88906d83d87fd3c9b4d6a1dad7b1cac08ee84bb6"} Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.024040 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5fc8c636-1360-4580-b0e4-00d8b4d13e8b","Type":"ContainerDied","Data":"e24bd2aed454c78b4c5d1cc3149ebfc14598e23ba6c925e866641c978ad2ec7b"} Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.024059 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5fc8c636-1360-4580-b0e4-00d8b4d13e8b","Type":"ContainerDied","Data":"c88a80ef118cc2bda4e31b0264086d85d3437e57bfe2a150c14cff788f679c95"} Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.024050 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.024098 4740 scope.go:117] "RemoveContainer" containerID="f80a124101758a10a5749cfc88906d83d87fd3c9b4d6a1dad7b1cac08ee84bb6" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.032209 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.032239 4740 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.032249 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.032260 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rsq4\" (UniqueName: \"kubernetes.io/projected/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-kube-api-access-2rsq4\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.032268 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-logs\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.032276 4740 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fc8c636-1360-4580-b0e4-00d8b4d13e8b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.037872 4740 generic.go:334] "Generic (PLEG): container finished" 
podID="c44a55b0-facb-44d5-8343-f2274cc5171d" containerID="052e8ce478536c9612c888f5a22b2015f3fc1d6b3f73396fccf725ed062f5752" exitCode=143 Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.037929 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c44a55b0-facb-44d5-8343-f2274cc5171d","Type":"ContainerDied","Data":"052e8ce478536c9612c888f5a22b2015f3fc1d6b3f73396fccf725ed062f5752"} Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.064505 4740 scope.go:117] "RemoveContainer" containerID="e24bd2aed454c78b4c5d1cc3149ebfc14598e23ba6c925e866641c978ad2ec7b" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.066905 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.074604 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.086407 4740 scope.go:117] "RemoveContainer" containerID="f80a124101758a10a5749cfc88906d83d87fd3c9b4d6a1dad7b1cac08ee84bb6" Oct 09 10:47:54 crc kubenswrapper[4740]: E1009 10:47:54.086980 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f80a124101758a10a5749cfc88906d83d87fd3c9b4d6a1dad7b1cac08ee84bb6\": container with ID starting with f80a124101758a10a5749cfc88906d83d87fd3c9b4d6a1dad7b1cac08ee84bb6 not found: ID does not exist" containerID="f80a124101758a10a5749cfc88906d83d87fd3c9b4d6a1dad7b1cac08ee84bb6" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.087028 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80a124101758a10a5749cfc88906d83d87fd3c9b4d6a1dad7b1cac08ee84bb6"} err="failed to get container status \"f80a124101758a10a5749cfc88906d83d87fd3c9b4d6a1dad7b1cac08ee84bb6\": rpc error: code = NotFound desc = could not find container 
\"f80a124101758a10a5749cfc88906d83d87fd3c9b4d6a1dad7b1cac08ee84bb6\": container with ID starting with f80a124101758a10a5749cfc88906d83d87fd3c9b4d6a1dad7b1cac08ee84bb6 not found: ID does not exist" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.087060 4740 scope.go:117] "RemoveContainer" containerID="e24bd2aed454c78b4c5d1cc3149ebfc14598e23ba6c925e866641c978ad2ec7b" Oct 09 10:47:54 crc kubenswrapper[4740]: E1009 10:47:54.087366 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e24bd2aed454c78b4c5d1cc3149ebfc14598e23ba6c925e866641c978ad2ec7b\": container with ID starting with e24bd2aed454c78b4c5d1cc3149ebfc14598e23ba6c925e866641c978ad2ec7b not found: ID does not exist" containerID="e24bd2aed454c78b4c5d1cc3149ebfc14598e23ba6c925e866641c978ad2ec7b" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.087393 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e24bd2aed454c78b4c5d1cc3149ebfc14598e23ba6c925e866641c978ad2ec7b"} err="failed to get container status \"e24bd2aed454c78b4c5d1cc3149ebfc14598e23ba6c925e866641c978ad2ec7b\": rpc error: code = NotFound desc = could not find container \"e24bd2aed454c78b4c5d1cc3149ebfc14598e23ba6c925e866641c978ad2ec7b\": container with ID starting with e24bd2aed454c78b4c5d1cc3149ebfc14598e23ba6c925e866641c978ad2ec7b not found: ID does not exist" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.087410 4740 scope.go:117] "RemoveContainer" containerID="f80a124101758a10a5749cfc88906d83d87fd3c9b4d6a1dad7b1cac08ee84bb6" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.087680 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80a124101758a10a5749cfc88906d83d87fd3c9b4d6a1dad7b1cac08ee84bb6"} err="failed to get container status \"f80a124101758a10a5749cfc88906d83d87fd3c9b4d6a1dad7b1cac08ee84bb6\": rpc error: code = NotFound desc = could not find 
container \"f80a124101758a10a5749cfc88906d83d87fd3c9b4d6a1dad7b1cac08ee84bb6\": container with ID starting with f80a124101758a10a5749cfc88906d83d87fd3c9b4d6a1dad7b1cac08ee84bb6 not found: ID does not exist" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.087728 4740 scope.go:117] "RemoveContainer" containerID="e24bd2aed454c78b4c5d1cc3149ebfc14598e23ba6c925e866641c978ad2ec7b" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.088062 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e24bd2aed454c78b4c5d1cc3149ebfc14598e23ba6c925e866641c978ad2ec7b"} err="failed to get container status \"e24bd2aed454c78b4c5d1cc3149ebfc14598e23ba6c925e866641c978ad2ec7b\": rpc error: code = NotFound desc = could not find container \"e24bd2aed454c78b4c5d1cc3149ebfc14598e23ba6c925e866641c978ad2ec7b\": container with ID starting with e24bd2aed454c78b4c5d1cc3149ebfc14598e23ba6c925e866641c978ad2ec7b not found: ID does not exist" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.092231 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 09 10:47:54 crc kubenswrapper[4740]: E1009 10:47:54.092618 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fc8c636-1360-4580-b0e4-00d8b4d13e8b" containerName="nova-api-log" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.092642 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc8c636-1360-4580-b0e4-00d8b4d13e8b" containerName="nova-api-log" Oct 09 10:47:54 crc kubenswrapper[4740]: E1009 10:47:54.092658 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5a8492c-3f09-4613-a24f-3f17de65767d" containerName="init" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.092666 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5a8492c-3f09-4613-a24f-3f17de65767d" containerName="init" Oct 09 10:47:54 crc kubenswrapper[4740]: E1009 10:47:54.092678 4740 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5b098d98-b0c4-46f4-b79b-57a6405f0385" containerName="nova-manage" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.092685 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b098d98-b0c4-46f4-b79b-57a6405f0385" containerName="nova-manage" Oct 09 10:47:54 crc kubenswrapper[4740]: E1009 10:47:54.092701 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fc8c636-1360-4580-b0e4-00d8b4d13e8b" containerName="nova-api-api" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.092711 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc8c636-1360-4580-b0e4-00d8b4d13e8b" containerName="nova-api-api" Oct 09 10:47:54 crc kubenswrapper[4740]: E1009 10:47:54.092742 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5a8492c-3f09-4613-a24f-3f17de65767d" containerName="dnsmasq-dns" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.092750 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5a8492c-3f09-4613-a24f-3f17de65767d" containerName="dnsmasq-dns" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.092945 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fc8c636-1360-4580-b0e4-00d8b4d13e8b" containerName="nova-api-log" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.092961 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fc8c636-1360-4580-b0e4-00d8b4d13e8b" containerName="nova-api-api" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.092975 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5a8492c-3f09-4613-a24f-3f17de65767d" containerName="dnsmasq-dns" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.092983 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b098d98-b0c4-46f4-b79b-57a6405f0385" containerName="nova-manage" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.094005 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.096514 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.096781 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.099325 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.110825 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.237427 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/260e0d21-3655-4a0d-a51e-6c483e20c7f5-logs\") pod \"nova-api-0\" (UID: \"260e0d21-3655-4a0d-a51e-6c483e20c7f5\") " pod="openstack/nova-api-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.237641 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/260e0d21-3655-4a0d-a51e-6c483e20c7f5-config-data\") pod \"nova-api-0\" (UID: \"260e0d21-3655-4a0d-a51e-6c483e20c7f5\") " pod="openstack/nova-api-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.237810 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/260e0d21-3655-4a0d-a51e-6c483e20c7f5-public-tls-certs\") pod \"nova-api-0\" (UID: \"260e0d21-3655-4a0d-a51e-6c483e20c7f5\") " pod="openstack/nova-api-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.237867 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75jw6\" (UniqueName: 
\"kubernetes.io/projected/260e0d21-3655-4a0d-a51e-6c483e20c7f5-kube-api-access-75jw6\") pod \"nova-api-0\" (UID: \"260e0d21-3655-4a0d-a51e-6c483e20c7f5\") " pod="openstack/nova-api-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.237892 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/260e0d21-3655-4a0d-a51e-6c483e20c7f5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"260e0d21-3655-4a0d-a51e-6c483e20c7f5\") " pod="openstack/nova-api-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.238132 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/260e0d21-3655-4a0d-a51e-6c483e20c7f5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"260e0d21-3655-4a0d-a51e-6c483e20c7f5\") " pod="openstack/nova-api-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.341214 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/260e0d21-3655-4a0d-a51e-6c483e20c7f5-config-data\") pod \"nova-api-0\" (UID: \"260e0d21-3655-4a0d-a51e-6c483e20c7f5\") " pod="openstack/nova-api-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.341286 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/260e0d21-3655-4a0d-a51e-6c483e20c7f5-public-tls-certs\") pod \"nova-api-0\" (UID: \"260e0d21-3655-4a0d-a51e-6c483e20c7f5\") " pod="openstack/nova-api-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.341312 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75jw6\" (UniqueName: \"kubernetes.io/projected/260e0d21-3655-4a0d-a51e-6c483e20c7f5-kube-api-access-75jw6\") pod \"nova-api-0\" (UID: \"260e0d21-3655-4a0d-a51e-6c483e20c7f5\") " 
pod="openstack/nova-api-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.341329 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/260e0d21-3655-4a0d-a51e-6c483e20c7f5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"260e0d21-3655-4a0d-a51e-6c483e20c7f5\") " pod="openstack/nova-api-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.341377 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/260e0d21-3655-4a0d-a51e-6c483e20c7f5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"260e0d21-3655-4a0d-a51e-6c483e20c7f5\") " pod="openstack/nova-api-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.341474 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/260e0d21-3655-4a0d-a51e-6c483e20c7f5-logs\") pod \"nova-api-0\" (UID: \"260e0d21-3655-4a0d-a51e-6c483e20c7f5\") " pod="openstack/nova-api-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.341968 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/260e0d21-3655-4a0d-a51e-6c483e20c7f5-logs\") pod \"nova-api-0\" (UID: \"260e0d21-3655-4a0d-a51e-6c483e20c7f5\") " pod="openstack/nova-api-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.347710 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/260e0d21-3655-4a0d-a51e-6c483e20c7f5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"260e0d21-3655-4a0d-a51e-6c483e20c7f5\") " pod="openstack/nova-api-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.347710 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/260e0d21-3655-4a0d-a51e-6c483e20c7f5-public-tls-certs\") pod \"nova-api-0\" (UID: \"260e0d21-3655-4a0d-a51e-6c483e20c7f5\") " pod="openstack/nova-api-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.347864 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/260e0d21-3655-4a0d-a51e-6c483e20c7f5-config-data\") pod \"nova-api-0\" (UID: \"260e0d21-3655-4a0d-a51e-6c483e20c7f5\") " pod="openstack/nova-api-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.348400 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/260e0d21-3655-4a0d-a51e-6c483e20c7f5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"260e0d21-3655-4a0d-a51e-6c483e20c7f5\") " pod="openstack/nova-api-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.359287 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75jw6\" (UniqueName: \"kubernetes.io/projected/260e0d21-3655-4a0d-a51e-6c483e20c7f5-kube-api-access-75jw6\") pod \"nova-api-0\" (UID: \"260e0d21-3655-4a0d-a51e-6c483e20c7f5\") " pod="openstack/nova-api-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.417632 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.907484 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 10:47:54 crc kubenswrapper[4740]: I1009 10:47:54.942221 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.048335 4740 generic.go:334] "Generic (PLEG): container finished" podID="ef87ed9c-12c3-4ee4-9011-a826acaab478" containerID="7a00d6968d0cce600872635acd59d6f9076dc2ff201b4eb9fd26a49cf4211f59" exitCode=0 Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.048423 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.048453 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ef87ed9c-12c3-4ee4-9011-a826acaab478","Type":"ContainerDied","Data":"7a00d6968d0cce600872635acd59d6f9076dc2ff201b4eb9fd26a49cf4211f59"} Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.048487 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ef87ed9c-12c3-4ee4-9011-a826acaab478","Type":"ContainerDied","Data":"b35c45ab00c4c668ba93e95aef056125f07d1e7cb0470de6eb96cbee56829528"} Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.048505 4740 scope.go:117] "RemoveContainer" containerID="7a00d6968d0cce600872635acd59d6f9076dc2ff201b4eb9fd26a49cf4211f59" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.049495 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"260e0d21-3655-4a0d-a51e-6c483e20c7f5","Type":"ContainerStarted","Data":"2c1ecd83d0214aa01f3e8c05a8b190063131ffb0cfb0fbb5eb074be5a99d9d95"} Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.067885 4740 scope.go:117] "RemoveContainer" containerID="7a00d6968d0cce600872635acd59d6f9076dc2ff201b4eb9fd26a49cf4211f59" Oct 09 10:47:55 crc kubenswrapper[4740]: E1009 10:47:55.068431 4740 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a00d6968d0cce600872635acd59d6f9076dc2ff201b4eb9fd26a49cf4211f59\": container with ID starting with 7a00d6968d0cce600872635acd59d6f9076dc2ff201b4eb9fd26a49cf4211f59 not found: ID does not exist" containerID="7a00d6968d0cce600872635acd59d6f9076dc2ff201b4eb9fd26a49cf4211f59" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.068468 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a00d6968d0cce600872635acd59d6f9076dc2ff201b4eb9fd26a49cf4211f59"} err="failed to get container status \"7a00d6968d0cce600872635acd59d6f9076dc2ff201b4eb9fd26a49cf4211f59\": rpc error: code = NotFound desc = could not find container \"7a00d6968d0cce600872635acd59d6f9076dc2ff201b4eb9fd26a49cf4211f59\": container with ID starting with 7a00d6968d0cce600872635acd59d6f9076dc2ff201b4eb9fd26a49cf4211f59 not found: ID does not exist" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.080370 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7wfk\" (UniqueName: \"kubernetes.io/projected/ef87ed9c-12c3-4ee4-9011-a826acaab478-kube-api-access-s7wfk\") pod \"ef87ed9c-12c3-4ee4-9011-a826acaab478\" (UID: \"ef87ed9c-12c3-4ee4-9011-a826acaab478\") " Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.080589 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef87ed9c-12c3-4ee4-9011-a826acaab478-config-data\") pod \"ef87ed9c-12c3-4ee4-9011-a826acaab478\" (UID: \"ef87ed9c-12c3-4ee4-9011-a826acaab478\") " Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.080674 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef87ed9c-12c3-4ee4-9011-a826acaab478-combined-ca-bundle\") pod \"ef87ed9c-12c3-4ee4-9011-a826acaab478\" (UID: 
\"ef87ed9c-12c3-4ee4-9011-a826acaab478\") " Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.085640 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef87ed9c-12c3-4ee4-9011-a826acaab478-kube-api-access-s7wfk" (OuterVolumeSpecName: "kube-api-access-s7wfk") pod "ef87ed9c-12c3-4ee4-9011-a826acaab478" (UID: "ef87ed9c-12c3-4ee4-9011-a826acaab478"). InnerVolumeSpecName "kube-api-access-s7wfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.108489 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef87ed9c-12c3-4ee4-9011-a826acaab478-config-data" (OuterVolumeSpecName: "config-data") pod "ef87ed9c-12c3-4ee4-9011-a826acaab478" (UID: "ef87ed9c-12c3-4ee4-9011-a826acaab478"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.110108 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef87ed9c-12c3-4ee4-9011-a826acaab478-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef87ed9c-12c3-4ee4-9011-a826acaab478" (UID: "ef87ed9c-12c3-4ee4-9011-a826acaab478"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.183841 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7wfk\" (UniqueName: \"kubernetes.io/projected/ef87ed9c-12c3-4ee4-9011-a826acaab478-kube-api-access-s7wfk\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.183880 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef87ed9c-12c3-4ee4-9011-a826acaab478-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.183893 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef87ed9c-12c3-4ee4-9011-a826acaab478-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.381481 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.393006 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.427315 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 10:47:55 crc kubenswrapper[4740]: E1009 10:47:55.428071 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef87ed9c-12c3-4ee4-9011-a826acaab478" containerName="nova-scheduler-scheduler" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.428092 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef87ed9c-12c3-4ee4-9011-a826acaab478" containerName="nova-scheduler-scheduler" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.428312 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef87ed9c-12c3-4ee4-9011-a826acaab478" containerName="nova-scheduler-scheduler" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 
10:47:55.429067 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.435060 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.438706 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.591288 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bb674db-a6fb-4100-82d2-2fae6660902b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3bb674db-a6fb-4100-82d2-2fae6660902b\") " pod="openstack/nova-scheduler-0" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.591518 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bb674db-a6fb-4100-82d2-2fae6660902b-config-data\") pod \"nova-scheduler-0\" (UID: \"3bb674db-a6fb-4100-82d2-2fae6660902b\") " pod="openstack/nova-scheduler-0" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.591570 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-854p4\" (UniqueName: \"kubernetes.io/projected/3bb674db-a6fb-4100-82d2-2fae6660902b-kube-api-access-854p4\") pod \"nova-scheduler-0\" (UID: \"3bb674db-a6fb-4100-82d2-2fae6660902b\") " pod="openstack/nova-scheduler-0" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.693051 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bb674db-a6fb-4100-82d2-2fae6660902b-config-data\") pod \"nova-scheduler-0\" (UID: \"3bb674db-a6fb-4100-82d2-2fae6660902b\") " pod="openstack/nova-scheduler-0" Oct 09 10:47:55 crc 
kubenswrapper[4740]: I1009 10:47:55.693119 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-854p4\" (UniqueName: \"kubernetes.io/projected/3bb674db-a6fb-4100-82d2-2fae6660902b-kube-api-access-854p4\") pod \"nova-scheduler-0\" (UID: \"3bb674db-a6fb-4100-82d2-2fae6660902b\") " pod="openstack/nova-scheduler-0" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.693162 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bb674db-a6fb-4100-82d2-2fae6660902b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3bb674db-a6fb-4100-82d2-2fae6660902b\") " pod="openstack/nova-scheduler-0" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.698061 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bb674db-a6fb-4100-82d2-2fae6660902b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3bb674db-a6fb-4100-82d2-2fae6660902b\") " pod="openstack/nova-scheduler-0" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.699468 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bb674db-a6fb-4100-82d2-2fae6660902b-config-data\") pod \"nova-scheduler-0\" (UID: \"3bb674db-a6fb-4100-82d2-2fae6660902b\") " pod="openstack/nova-scheduler-0" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.709300 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-854p4\" (UniqueName: \"kubernetes.io/projected/3bb674db-a6fb-4100-82d2-2fae6660902b-kube-api-access-854p4\") pod \"nova-scheduler-0\" (UID: \"3bb674db-a6fb-4100-82d2-2fae6660902b\") " pod="openstack/nova-scheduler-0" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.758714 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.774900 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fc8c636-1360-4580-b0e4-00d8b4d13e8b" path="/var/lib/kubelet/pods/5fc8c636-1360-4580-b0e4-00d8b4d13e8b/volumes" Oct 09 10:47:55 crc kubenswrapper[4740]: I1009 10:47:55.776193 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef87ed9c-12c3-4ee4-9011-a826acaab478" path="/var/lib/kubelet/pods/ef87ed9c-12c3-4ee4-9011-a826acaab478/volumes" Oct 09 10:47:56 crc kubenswrapper[4740]: I1009 10:47:56.142305 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"260e0d21-3655-4a0d-a51e-6c483e20c7f5","Type":"ContainerStarted","Data":"e930099d6db794561b73c39c04c6c5705ef25f76ce06ad71585d936f44ce15eb"} Oct 09 10:47:56 crc kubenswrapper[4740]: I1009 10:47:56.142687 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"260e0d21-3655-4a0d-a51e-6c483e20c7f5","Type":"ContainerStarted","Data":"250c832ab6d31a34759744a0fb2cf030ba1c9646442095531aa19df70e81499e"} Oct 09 10:47:56 crc kubenswrapper[4740]: I1009 10:47:56.196965 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.196943987 podStartE2EDuration="2.196943987s" podCreationTimestamp="2025-10-09 10:47:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:47:56.178241446 +0000 UTC m=+1215.140441827" watchObservedRunningTime="2025-10-09 10:47:56.196943987 +0000 UTC m=+1215.159144368" Oct 09 10:47:56 crc kubenswrapper[4740]: I1009 10:47:56.288952 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 10:47:56 crc kubenswrapper[4740]: I1009 10:47:56.433670 4740 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/nova-metadata-0" podUID="c44a55b0-facb-44d5-8343-f2274cc5171d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:60642->10.217.0.197:8775: read: connection reset by peer" Oct 09 10:47:56 crc kubenswrapper[4740]: I1009 10:47:56.434005 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c44a55b0-facb-44d5-8343-f2274cc5171d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:60656->10.217.0.197:8775: read: connection reset by peer" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:56.878921 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.029028 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44a55b0-facb-44d5-8343-f2274cc5171d-combined-ca-bundle\") pod \"c44a55b0-facb-44d5-8343-f2274cc5171d\" (UID: \"c44a55b0-facb-44d5-8343-f2274cc5171d\") " Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.029450 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56rkf\" (UniqueName: \"kubernetes.io/projected/c44a55b0-facb-44d5-8343-f2274cc5171d-kube-api-access-56rkf\") pod \"c44a55b0-facb-44d5-8343-f2274cc5171d\" (UID: \"c44a55b0-facb-44d5-8343-f2274cc5171d\") " Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.029492 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c44a55b0-facb-44d5-8343-f2274cc5171d-config-data\") pod \"c44a55b0-facb-44d5-8343-f2274cc5171d\" (UID: \"c44a55b0-facb-44d5-8343-f2274cc5171d\") " Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.029550 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c44a55b0-facb-44d5-8343-f2274cc5171d-logs\") pod \"c44a55b0-facb-44d5-8343-f2274cc5171d\" (UID: \"c44a55b0-facb-44d5-8343-f2274cc5171d\") " Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.029688 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c44a55b0-facb-44d5-8343-f2274cc5171d-nova-metadata-tls-certs\") pod \"c44a55b0-facb-44d5-8343-f2274cc5171d\" (UID: \"c44a55b0-facb-44d5-8343-f2274cc5171d\") " Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.034087 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c44a55b0-facb-44d5-8343-f2274cc5171d-logs" (OuterVolumeSpecName: "logs") pod "c44a55b0-facb-44d5-8343-f2274cc5171d" (UID: "c44a55b0-facb-44d5-8343-f2274cc5171d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.037827 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c44a55b0-facb-44d5-8343-f2274cc5171d-kube-api-access-56rkf" (OuterVolumeSpecName: "kube-api-access-56rkf") pod "c44a55b0-facb-44d5-8343-f2274cc5171d" (UID: "c44a55b0-facb-44d5-8343-f2274cc5171d"). InnerVolumeSpecName "kube-api-access-56rkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.064532 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c44a55b0-facb-44d5-8343-f2274cc5171d-config-data" (OuterVolumeSpecName: "config-data") pod "c44a55b0-facb-44d5-8343-f2274cc5171d" (UID: "c44a55b0-facb-44d5-8343-f2274cc5171d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.067621 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c44a55b0-facb-44d5-8343-f2274cc5171d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c44a55b0-facb-44d5-8343-f2274cc5171d" (UID: "c44a55b0-facb-44d5-8343-f2274cc5171d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.083823 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c44a55b0-facb-44d5-8343-f2274cc5171d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c44a55b0-facb-44d5-8343-f2274cc5171d" (UID: "c44a55b0-facb-44d5-8343-f2274cc5171d"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.134137 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c44a55b0-facb-44d5-8343-f2274cc5171d-logs\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.134165 4740 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c44a55b0-facb-44d5-8343-f2274cc5171d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.134175 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44a55b0-facb-44d5-8343-f2274cc5171d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.134183 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56rkf\" (UniqueName: 
\"kubernetes.io/projected/c44a55b0-facb-44d5-8343-f2274cc5171d-kube-api-access-56rkf\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.134192 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c44a55b0-facb-44d5-8343-f2274cc5171d-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.153974 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3bb674db-a6fb-4100-82d2-2fae6660902b","Type":"ContainerStarted","Data":"fe9803a7dcf0b0931299a9e273c023d6c212c916fc0828d293a4e1b945bdd396"} Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.154019 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3bb674db-a6fb-4100-82d2-2fae6660902b","Type":"ContainerStarted","Data":"8c72cb77975c8f4b0756905c2de09fade1285e3e1bd876f6707af8d5339554ae"} Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.156856 4740 generic.go:334] "Generic (PLEG): container finished" podID="c44a55b0-facb-44d5-8343-f2274cc5171d" containerID="b6ea6a86794b59fb1d84d184f594a9fdf55672db0a12b37f300f8972e09b7538" exitCode=0 Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.157274 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.157332 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c44a55b0-facb-44d5-8343-f2274cc5171d","Type":"ContainerDied","Data":"b6ea6a86794b59fb1d84d184f594a9fdf55672db0a12b37f300f8972e09b7538"} Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.157379 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c44a55b0-facb-44d5-8343-f2274cc5171d","Type":"ContainerDied","Data":"b4d43e2fc764aba65b513260a7cb2b75b29f101ed22f5408675bf501c020be95"} Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.157401 4740 scope.go:117] "RemoveContainer" containerID="b6ea6a86794b59fb1d84d184f594a9fdf55672db0a12b37f300f8972e09b7538" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.176534 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.176516898 podStartE2EDuration="2.176516898s" podCreationTimestamp="2025-10-09 10:47:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:47:57.172033255 +0000 UTC m=+1216.134233636" watchObservedRunningTime="2025-10-09 10:47:57.176516898 +0000 UTC m=+1216.138717279" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.202597 4740 scope.go:117] "RemoveContainer" containerID="052e8ce478536c9612c888f5a22b2015f3fc1d6b3f73396fccf725ed062f5752" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.220113 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.237485 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.246610 4740 scope.go:117] "RemoveContainer" 
containerID="b6ea6a86794b59fb1d84d184f594a9fdf55672db0a12b37f300f8972e09b7538" Oct 09 10:47:57 crc kubenswrapper[4740]: E1009 10:47:57.247152 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6ea6a86794b59fb1d84d184f594a9fdf55672db0a12b37f300f8972e09b7538\": container with ID starting with b6ea6a86794b59fb1d84d184f594a9fdf55672db0a12b37f300f8972e09b7538 not found: ID does not exist" containerID="b6ea6a86794b59fb1d84d184f594a9fdf55672db0a12b37f300f8972e09b7538" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.247181 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6ea6a86794b59fb1d84d184f594a9fdf55672db0a12b37f300f8972e09b7538"} err="failed to get container status \"b6ea6a86794b59fb1d84d184f594a9fdf55672db0a12b37f300f8972e09b7538\": rpc error: code = NotFound desc = could not find container \"b6ea6a86794b59fb1d84d184f594a9fdf55672db0a12b37f300f8972e09b7538\": container with ID starting with b6ea6a86794b59fb1d84d184f594a9fdf55672db0a12b37f300f8972e09b7538 not found: ID does not exist" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.247202 4740 scope.go:117] "RemoveContainer" containerID="052e8ce478536c9612c888f5a22b2015f3fc1d6b3f73396fccf725ed062f5752" Oct 09 10:47:57 crc kubenswrapper[4740]: E1009 10:47:57.247475 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"052e8ce478536c9612c888f5a22b2015f3fc1d6b3f73396fccf725ed062f5752\": container with ID starting with 052e8ce478536c9612c888f5a22b2015f3fc1d6b3f73396fccf725ed062f5752 not found: ID does not exist" containerID="052e8ce478536c9612c888f5a22b2015f3fc1d6b3f73396fccf725ed062f5752" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.247495 4740 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"052e8ce478536c9612c888f5a22b2015f3fc1d6b3f73396fccf725ed062f5752"} err="failed to get container status \"052e8ce478536c9612c888f5a22b2015f3fc1d6b3f73396fccf725ed062f5752\": rpc error: code = NotFound desc = could not find container \"052e8ce478536c9612c888f5a22b2015f3fc1d6b3f73396fccf725ed062f5752\": container with ID starting with 052e8ce478536c9612c888f5a22b2015f3fc1d6b3f73396fccf725ed062f5752 not found: ID does not exist" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.254693 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 09 10:47:57 crc kubenswrapper[4740]: E1009 10:47:57.256306 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c44a55b0-facb-44d5-8343-f2274cc5171d" containerName="nova-metadata-log" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.256361 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c44a55b0-facb-44d5-8343-f2274cc5171d" containerName="nova-metadata-log" Oct 09 10:47:57 crc kubenswrapper[4740]: E1009 10:47:57.256398 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c44a55b0-facb-44d5-8343-f2274cc5171d" containerName="nova-metadata-metadata" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.256431 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c44a55b0-facb-44d5-8343-f2274cc5171d" containerName="nova-metadata-metadata" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.256779 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="c44a55b0-facb-44d5-8343-f2274cc5171d" containerName="nova-metadata-metadata" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.256804 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="c44a55b0-facb-44d5-8343-f2274cc5171d" containerName="nova-metadata-log" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.258597 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.261438 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.265070 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.265906 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.345416 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3b597d-a996-4ebd-b896-61c6c62a0145-config-data\") pod \"nova-metadata-0\" (UID: \"df3b597d-a996-4ebd-b896-61c6c62a0145\") " pod="openstack/nova-metadata-0" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.345490 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/df3b597d-a996-4ebd-b896-61c6c62a0145-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"df3b597d-a996-4ebd-b896-61c6c62a0145\") " pod="openstack/nova-metadata-0" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.345540 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3b597d-a996-4ebd-b896-61c6c62a0145-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"df3b597d-a996-4ebd-b896-61c6c62a0145\") " pod="openstack/nova-metadata-0" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.345621 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hclf5\" (UniqueName: 
\"kubernetes.io/projected/df3b597d-a996-4ebd-b896-61c6c62a0145-kube-api-access-hclf5\") pod \"nova-metadata-0\" (UID: \"df3b597d-a996-4ebd-b896-61c6c62a0145\") " pod="openstack/nova-metadata-0" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.345649 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df3b597d-a996-4ebd-b896-61c6c62a0145-logs\") pod \"nova-metadata-0\" (UID: \"df3b597d-a996-4ebd-b896-61c6c62a0145\") " pod="openstack/nova-metadata-0" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.447379 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3b597d-a996-4ebd-b896-61c6c62a0145-config-data\") pod \"nova-metadata-0\" (UID: \"df3b597d-a996-4ebd-b896-61c6c62a0145\") " pod="openstack/nova-metadata-0" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.447480 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/df3b597d-a996-4ebd-b896-61c6c62a0145-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"df3b597d-a996-4ebd-b896-61c6c62a0145\") " pod="openstack/nova-metadata-0" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.447523 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3b597d-a996-4ebd-b896-61c6c62a0145-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"df3b597d-a996-4ebd-b896-61c6c62a0145\") " pod="openstack/nova-metadata-0" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.448127 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hclf5\" (UniqueName: \"kubernetes.io/projected/df3b597d-a996-4ebd-b896-61c6c62a0145-kube-api-access-hclf5\") pod \"nova-metadata-0\" (UID: \"df3b597d-a996-4ebd-b896-61c6c62a0145\") 
" pod="openstack/nova-metadata-0" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.448174 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df3b597d-a996-4ebd-b896-61c6c62a0145-logs\") pod \"nova-metadata-0\" (UID: \"df3b597d-a996-4ebd-b896-61c6c62a0145\") " pod="openstack/nova-metadata-0" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.448626 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df3b597d-a996-4ebd-b896-61c6c62a0145-logs\") pod \"nova-metadata-0\" (UID: \"df3b597d-a996-4ebd-b896-61c6c62a0145\") " pod="openstack/nova-metadata-0" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.452451 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3b597d-a996-4ebd-b896-61c6c62a0145-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"df3b597d-a996-4ebd-b896-61c6c62a0145\") " pod="openstack/nova-metadata-0" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.456730 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3b597d-a996-4ebd-b896-61c6c62a0145-config-data\") pod \"nova-metadata-0\" (UID: \"df3b597d-a996-4ebd-b896-61c6c62a0145\") " pod="openstack/nova-metadata-0" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.467498 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/df3b597d-a996-4ebd-b896-61c6c62a0145-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"df3b597d-a996-4ebd-b896-61c6c62a0145\") " pod="openstack/nova-metadata-0" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.473714 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hclf5\" (UniqueName: 
\"kubernetes.io/projected/df3b597d-a996-4ebd-b896-61c6c62a0145-kube-api-access-hclf5\") pod \"nova-metadata-0\" (UID: \"df3b597d-a996-4ebd-b896-61c6c62a0145\") " pod="openstack/nova-metadata-0" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.581384 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.798790 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c44a55b0-facb-44d5-8343-f2274cc5171d" path="/var/lib/kubelet/pods/c44a55b0-facb-44d5-8343-f2274cc5171d/volumes" Oct 09 10:47:57 crc kubenswrapper[4740]: I1009 10:47:57.842577 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 10:47:57 crc kubenswrapper[4740]: W1009 10:47:57.844102 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf3b597d_a996_4ebd_b896_61c6c62a0145.slice/crio-f6edb09d2903101b53b17218066c937ef65dcb4a53210c3e2bd7b3b358d3525f WatchSource:0}: Error finding container f6edb09d2903101b53b17218066c937ef65dcb4a53210c3e2bd7b3b358d3525f: Status 404 returned error can't find the container with id f6edb09d2903101b53b17218066c937ef65dcb4a53210c3e2bd7b3b358d3525f Oct 09 10:47:58 crc kubenswrapper[4740]: I1009 10:47:58.173779 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df3b597d-a996-4ebd-b896-61c6c62a0145","Type":"ContainerStarted","Data":"634540f03b3dce4539e0cb242f7de85863fc658c2d88b212e0de8c947baa4088"} Oct 09 10:47:58 crc kubenswrapper[4740]: I1009 10:47:58.174167 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df3b597d-a996-4ebd-b896-61c6c62a0145","Type":"ContainerStarted","Data":"f6edb09d2903101b53b17218066c937ef65dcb4a53210c3e2bd7b3b358d3525f"} Oct 09 10:47:59 crc kubenswrapper[4740]: I1009 10:47:59.186097 4740 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df3b597d-a996-4ebd-b896-61c6c62a0145","Type":"ContainerStarted","Data":"f774919351fdf0dd9a4c121319f140fbdf3f3debb8bf56bd4d7bcc24024b5477"} Oct 09 10:47:59 crc kubenswrapper[4740]: I1009 10:47:59.210658 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.210644398 podStartE2EDuration="2.210644398s" podCreationTimestamp="2025-10-09 10:47:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:47:59.209420205 +0000 UTC m=+1218.171620586" watchObservedRunningTime="2025-10-09 10:47:59.210644398 +0000 UTC m=+1218.172844779" Oct 09 10:48:00 crc kubenswrapper[4740]: I1009 10:48:00.759828 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 09 10:48:02 crc kubenswrapper[4740]: I1009 10:48:02.581929 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 10:48:02 crc kubenswrapper[4740]: I1009 10:48:02.582295 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 10:48:04 crc kubenswrapper[4740]: I1009 10:48:04.418684 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 10:48:04 crc kubenswrapper[4740]: I1009 10:48:04.419090 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 10:48:05 crc kubenswrapper[4740]: I1009 10:48:05.435120 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="260e0d21-3655-4a0d-a51e-6c483e20c7f5" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 10:48:05 crc 
kubenswrapper[4740]: I1009 10:48:05.435144 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="260e0d21-3655-4a0d-a51e-6c483e20c7f5" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 10:48:05 crc kubenswrapper[4740]: I1009 10:48:05.774635 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 09 10:48:05 crc kubenswrapper[4740]: I1009 10:48:05.810858 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 09 10:48:06 crc kubenswrapper[4740]: I1009 10:48:06.306238 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 09 10:48:07 crc kubenswrapper[4740]: I1009 10:48:07.581852 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 09 10:48:07 crc kubenswrapper[4740]: I1009 10:48:07.582186 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 09 10:48:08 crc kubenswrapper[4740]: I1009 10:48:08.597426 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="df3b597d-a996-4ebd-b896-61c6c62a0145" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 10:48:08 crc kubenswrapper[4740]: I1009 10:48:08.598470 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="df3b597d-a996-4ebd-b896-61c6c62a0145" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 10:48:14 crc kubenswrapper[4740]: I1009 
10:48:14.429354 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 09 10:48:14 crc kubenswrapper[4740]: I1009 10:48:14.430305 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 09 10:48:14 crc kubenswrapper[4740]: I1009 10:48:14.433827 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 09 10:48:14 crc kubenswrapper[4740]: I1009 10:48:14.436625 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 09 10:48:14 crc kubenswrapper[4740]: I1009 10:48:14.576848 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 09 10:48:15 crc kubenswrapper[4740]: I1009 10:48:15.365142 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 09 10:48:15 crc kubenswrapper[4740]: I1009 10:48:15.371358 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 09 10:48:17 crc kubenswrapper[4740]: I1009 10:48:17.588419 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 09 10:48:17 crc kubenswrapper[4740]: I1009 10:48:17.588703 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 09 10:48:17 crc kubenswrapper[4740]: I1009 10:48:17.592686 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 09 10:48:17 crc kubenswrapper[4740]: I1009 10:48:17.593275 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 09 10:48:26 crc kubenswrapper[4740]: I1009 10:48:26.110422 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 10:48:27 crc kubenswrapper[4740]: I1009 10:48:27.584668 4740 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 09 10:48:29 crc kubenswrapper[4740]: I1009 10:48:29.889098 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="aa98dfc6-da2e-42b0-a620-a07230e1833d" containerName="rabbitmq" containerID="cri-o://01b99e23c64ae6330a8f471ea454aa9650f76d3f7505ba0da69597a8e2af2368" gracePeriod=604797
Oct 09 10:48:31 crc kubenswrapper[4740]: I1009 10:48:31.377852 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="187134d2-2fe9-4beb-beff-6a48162a1933" containerName="rabbitmq" containerID="cri-o://94f0634b0ed255b557447062d5631f7ff62524a41768d6c0fafad907dce032a4" gracePeriod=604797
Oct 09 10:48:32 crc kubenswrapper[4740]: I1009 10:48:32.273415 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="aa98dfc6-da2e-42b0-a620-a07230e1833d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused"
Oct 09 10:48:32 crc kubenswrapper[4740]: I1009 10:48:32.560358 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="187134d2-2fe9-4beb-beff-6a48162a1933" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused"
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.552827 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.621483 4740 generic.go:334] "Generic (PLEG): container finished" podID="aa98dfc6-da2e-42b0-a620-a07230e1833d" containerID="01b99e23c64ae6330a8f471ea454aa9650f76d3f7505ba0da69597a8e2af2368" exitCode=0
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.621534 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.621538 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"aa98dfc6-da2e-42b0-a620-a07230e1833d","Type":"ContainerDied","Data":"01b99e23c64ae6330a8f471ea454aa9650f76d3f7505ba0da69597a8e2af2368"}
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.621649 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"aa98dfc6-da2e-42b0-a620-a07230e1833d","Type":"ContainerDied","Data":"15c391a994676971ddfb4fcc296685cf22f3a13685de4b7fc64e8f38c2640173"}
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.621666 4740 scope.go:117] "RemoveContainer" containerID="01b99e23c64ae6330a8f471ea454aa9650f76d3f7505ba0da69597a8e2af2368"
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.654807 4740 scope.go:117] "RemoveContainer" containerID="cdd4feba6cd032d418bc8180dd1a1569db9bc194b9a8d185360898a2b39c3a5c"
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.679661 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aa98dfc6-da2e-42b0-a620-a07230e1833d-plugins-conf\") pod \"aa98dfc6-da2e-42b0-a620-a07230e1833d\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") "
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.679893 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aa98dfc6-da2e-42b0-a620-a07230e1833d-server-conf\") pod \"aa98dfc6-da2e-42b0-a620-a07230e1833d\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") "
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.679937 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wx5x7\" (UniqueName: \"kubernetes.io/projected/aa98dfc6-da2e-42b0-a620-a07230e1833d-kube-api-access-wx5x7\") pod \"aa98dfc6-da2e-42b0-a620-a07230e1833d\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") "
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.679959 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-erlang-cookie\") pod \"aa98dfc6-da2e-42b0-a620-a07230e1833d\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") "
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.679977 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-plugins\") pod \"aa98dfc6-da2e-42b0-a620-a07230e1833d\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") "
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.680026 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aa98dfc6-da2e-42b0-a620-a07230e1833d-pod-info\") pod \"aa98dfc6-da2e-42b0-a620-a07230e1833d\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") "
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.680046 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-tls\") pod \"aa98dfc6-da2e-42b0-a620-a07230e1833d\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") "
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.680064 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa98dfc6-da2e-42b0-a620-a07230e1833d-config-data\") pod \"aa98dfc6-da2e-42b0-a620-a07230e1833d\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") "
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.680103 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aa98dfc6-da2e-42b0-a620-a07230e1833d-erlang-cookie-secret\") pod \"aa98dfc6-da2e-42b0-a620-a07230e1833d\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") "
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.680144 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"aa98dfc6-da2e-42b0-a620-a07230e1833d\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") "
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.680173 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-confd\") pod \"aa98dfc6-da2e-42b0-a620-a07230e1833d\" (UID: \"aa98dfc6-da2e-42b0-a620-a07230e1833d\") "
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.680666 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "aa98dfc6-da2e-42b0-a620-a07230e1833d" (UID: "aa98dfc6-da2e-42b0-a620-a07230e1833d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.681155 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa98dfc6-da2e-42b0-a620-a07230e1833d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "aa98dfc6-da2e-42b0-a620-a07230e1833d" (UID: "aa98dfc6-da2e-42b0-a620-a07230e1833d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.681633 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "aa98dfc6-da2e-42b0-a620-a07230e1833d" (UID: "aa98dfc6-da2e-42b0-a620-a07230e1833d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.687137 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa98dfc6-da2e-42b0-a620-a07230e1833d-kube-api-access-wx5x7" (OuterVolumeSpecName: "kube-api-access-wx5x7") pod "aa98dfc6-da2e-42b0-a620-a07230e1833d" (UID: "aa98dfc6-da2e-42b0-a620-a07230e1833d"). InnerVolumeSpecName "kube-api-access-wx5x7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.687182 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "aa98dfc6-da2e-42b0-a620-a07230e1833d" (UID: "aa98dfc6-da2e-42b0-a620-a07230e1833d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.687674 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/aa98dfc6-da2e-42b0-a620-a07230e1833d-pod-info" (OuterVolumeSpecName: "pod-info") pod "aa98dfc6-da2e-42b0-a620-a07230e1833d" (UID: "aa98dfc6-da2e-42b0-a620-a07230e1833d"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.687677 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa98dfc6-da2e-42b0-a620-a07230e1833d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "aa98dfc6-da2e-42b0-a620-a07230e1833d" (UID: "aa98dfc6-da2e-42b0-a620-a07230e1833d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.690722 4740 scope.go:117] "RemoveContainer" containerID="01b99e23c64ae6330a8f471ea454aa9650f76d3f7505ba0da69597a8e2af2368"
Oct 09 10:48:36 crc kubenswrapper[4740]: E1009 10:48:36.691141 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01b99e23c64ae6330a8f471ea454aa9650f76d3f7505ba0da69597a8e2af2368\": container with ID starting with 01b99e23c64ae6330a8f471ea454aa9650f76d3f7505ba0da69597a8e2af2368 not found: ID does not exist" containerID="01b99e23c64ae6330a8f471ea454aa9650f76d3f7505ba0da69597a8e2af2368"
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.691202 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01b99e23c64ae6330a8f471ea454aa9650f76d3f7505ba0da69597a8e2af2368"} err="failed to get container status \"01b99e23c64ae6330a8f471ea454aa9650f76d3f7505ba0da69597a8e2af2368\": rpc error: code = NotFound desc = could not find container \"01b99e23c64ae6330a8f471ea454aa9650f76d3f7505ba0da69597a8e2af2368\": container with ID starting with 01b99e23c64ae6330a8f471ea454aa9650f76d3f7505ba0da69597a8e2af2368 not found: ID does not exist"
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.691231 4740 scope.go:117] "RemoveContainer" containerID="cdd4feba6cd032d418bc8180dd1a1569db9bc194b9a8d185360898a2b39c3a5c"
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.691510 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "aa98dfc6-da2e-42b0-a620-a07230e1833d" (UID: "aa98dfc6-da2e-42b0-a620-a07230e1833d"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 09 10:48:36 crc kubenswrapper[4740]: E1009 10:48:36.691658 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdd4feba6cd032d418bc8180dd1a1569db9bc194b9a8d185360898a2b39c3a5c\": container with ID starting with cdd4feba6cd032d418bc8180dd1a1569db9bc194b9a8d185360898a2b39c3a5c not found: ID does not exist" containerID="cdd4feba6cd032d418bc8180dd1a1569db9bc194b9a8d185360898a2b39c3a5c"
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.691717 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdd4feba6cd032d418bc8180dd1a1569db9bc194b9a8d185360898a2b39c3a5c"} err="failed to get container status \"cdd4feba6cd032d418bc8180dd1a1569db9bc194b9a8d185360898a2b39c3a5c\": rpc error: code = NotFound desc = could not find container \"cdd4feba6cd032d418bc8180dd1a1569db9bc194b9a8d185360898a2b39c3a5c\": container with ID starting with cdd4feba6cd032d418bc8180dd1a1569db9bc194b9a8d185360898a2b39c3a5c not found: ID does not exist"
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.741564 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa98dfc6-da2e-42b0-a620-a07230e1833d-server-conf" (OuterVolumeSpecName: "server-conf") pod "aa98dfc6-da2e-42b0-a620-a07230e1833d" (UID: "aa98dfc6-da2e-42b0-a620-a07230e1833d"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.743609 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa98dfc6-da2e-42b0-a620-a07230e1833d-config-data" (OuterVolumeSpecName: "config-data") pod "aa98dfc6-da2e-42b0-a620-a07230e1833d" (UID: "aa98dfc6-da2e-42b0-a620-a07230e1833d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.782236 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wx5x7\" (UniqueName: \"kubernetes.io/projected/aa98dfc6-da2e-42b0-a620-a07230e1833d-kube-api-access-wx5x7\") on node \"crc\" DevicePath \"\""
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.782272 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.782286 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.782297 4740 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aa98dfc6-da2e-42b0-a620-a07230e1833d-pod-info\") on node \"crc\" DevicePath \"\""
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.782308 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.782319 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa98dfc6-da2e-42b0-a620-a07230e1833d-config-data\") on node \"crc\" DevicePath \"\""
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.782330 4740 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aa98dfc6-da2e-42b0-a620-a07230e1833d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.782354 4740 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.782364 4740 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aa98dfc6-da2e-42b0-a620-a07230e1833d-plugins-conf\") on node \"crc\" DevicePath \"\""
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.782374 4740 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aa98dfc6-da2e-42b0-a620-a07230e1833d-server-conf\") on node \"crc\" DevicePath \"\""
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.805625 4740 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.826028 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "aa98dfc6-da2e-42b0-a620-a07230e1833d" (UID: "aa98dfc6-da2e-42b0-a620-a07230e1833d"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.883897 4740 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.883929 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aa98dfc6-da2e-42b0-a620-a07230e1833d-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.964911 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.973261 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.994475 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 09 10:48:36 crc kubenswrapper[4740]: E1009 10:48:36.994976 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa98dfc6-da2e-42b0-a620-a07230e1833d" containerName="rabbitmq"
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.995003 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa98dfc6-da2e-42b0-a620-a07230e1833d" containerName="rabbitmq"
Oct 09 10:48:36 crc kubenswrapper[4740]: E1009 10:48:36.995029 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa98dfc6-da2e-42b0-a620-a07230e1833d" containerName="setup-container"
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.995038 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa98dfc6-da2e-42b0-a620-a07230e1833d" containerName="setup-container"
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.995266 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa98dfc6-da2e-42b0-a620-a07230e1833d" containerName="rabbitmq"
Oct 09 10:48:36 crc kubenswrapper[4740]: I1009 10:48:36.997008 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.007335 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.008636 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.008728 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.009074 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.009221 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-k7bn4"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.009442 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.009603 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.029400 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.189217 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.189570 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-config-data\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.189594 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.189632 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.189672 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.189725 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.189744 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.189788 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzn2s\" (UniqueName: \"kubernetes.io/projected/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-kube-api-access-wzn2s\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.189855 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.189886 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.189925 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.291330 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.291402 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-config-data\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.291427 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.291484 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.291526 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.291577 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.291600 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.291629 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzn2s\" (UniqueName: \"kubernetes.io/projected/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-kube-api-access-wzn2s\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.291686 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.291717 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.291769 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.292819 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.292954 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.293046 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-config-data\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.293067 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.293351 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.293869 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.297473 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.298264 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.298408 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.302535 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.317480 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzn2s\" (UniqueName: \"kubernetes.io/projected/ff4b6585-91c6-48f8-ba40-5cd075c7c59e-kube-api-access-wzn2s\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.324966 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"ff4b6585-91c6-48f8-ba40-5cd075c7c59e\") " pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.628399 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.639032 4740 generic.go:334] "Generic (PLEG): container finished" podID="187134d2-2fe9-4beb-beff-6a48162a1933" containerID="94f0634b0ed255b557447062d5631f7ff62524a41768d6c0fafad907dce032a4" exitCode=0
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.639091 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"187134d2-2fe9-4beb-beff-6a48162a1933","Type":"ContainerDied","Data":"94f0634b0ed255b557447062d5631f7ff62524a41768d6c0fafad907dce032a4"}
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.766502 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa98dfc6-da2e-42b0-a620-a07230e1833d" path="/var/lib/kubelet/pods/aa98dfc6-da2e-42b0-a620-a07230e1833d/volumes"
Oct 09 10:48:37 crc kubenswrapper[4740]: I1009 10:48:37.908780 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.027062 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/187134d2-2fe9-4beb-beff-6a48162a1933-server-conf\") pod \"187134d2-2fe9-4beb-beff-6a48162a1933\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") "
Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.027136 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-tls\") pod \"187134d2-2fe9-4beb-beff-6a48162a1933\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") "
Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.027188 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/187134d2-2fe9-4beb-beff-6a48162a1933-plugins-conf\") pod \"187134d2-2fe9-4beb-beff-6a48162a1933\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") "
Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.027267 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/187134d2-2fe9-4beb-beff-6a48162a1933-pod-info\") pod \"187134d2-2fe9-4beb-beff-6a48162a1933\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") "
Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.027299 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/187134d2-2fe9-4beb-beff-6a48162a1933-config-data\") pod \"187134d2-2fe9-4beb-beff-6a48162a1933\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") "
Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.027339 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-confd\") pod \"187134d2-2fe9-4beb-beff-6a48162a1933\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") "
Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.027396 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-plugins\") pod \"187134d2-2fe9-4beb-beff-6a48162a1933\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") "
Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.027430 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db275\" (UniqueName: \"kubernetes.io/projected/187134d2-2fe9-4beb-beff-6a48162a1933-kube-api-access-db275\") pod \"187134d2-2fe9-4beb-beff-6a48162a1933\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") "
Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.027511 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-erlang-cookie\") pod \"187134d2-2fe9-4beb-beff-6a48162a1933\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") "
Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.028498 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"187134d2-2fe9-4beb-beff-6a48162a1933\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") "
Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.028536 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/187134d2-2fe9-4beb-beff-6a48162a1933-erlang-cookie-secret\") pod \"187134d2-2fe9-4beb-beff-6a48162a1933\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") "
Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.028805 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "187134d2-2fe9-4beb-beff-6a48162a1933" (UID: "187134d2-2fe9-4beb-beff-6a48162a1933"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.029133 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "187134d2-2fe9-4beb-beff-6a48162a1933" (UID: "187134d2-2fe9-4beb-beff-6a48162a1933"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.029419 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/187134d2-2fe9-4beb-beff-6a48162a1933-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "187134d2-2fe9-4beb-beff-6a48162a1933" (UID: "187134d2-2fe9-4beb-beff-6a48162a1933"). InnerVolumeSpecName "plugins-conf".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.029685 4740 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/187134d2-2fe9-4beb-beff-6a48162a1933-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.029698 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.029710 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.033445 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "187134d2-2fe9-4beb-beff-6a48162a1933" (UID: "187134d2-2fe9-4beb-beff-6a48162a1933"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.034913 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/187134d2-2fe9-4beb-beff-6a48162a1933-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "187134d2-2fe9-4beb-beff-6a48162a1933" (UID: "187134d2-2fe9-4beb-beff-6a48162a1933"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.036892 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "187134d2-2fe9-4beb-beff-6a48162a1933" (UID: "187134d2-2fe9-4beb-beff-6a48162a1933"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.048470 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/187134d2-2fe9-4beb-beff-6a48162a1933-kube-api-access-db275" (OuterVolumeSpecName: "kube-api-access-db275") pod "187134d2-2fe9-4beb-beff-6a48162a1933" (UID: "187134d2-2fe9-4beb-beff-6a48162a1933"). InnerVolumeSpecName "kube-api-access-db275". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.048688 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/187134d2-2fe9-4beb-beff-6a48162a1933-pod-info" (OuterVolumeSpecName: "pod-info") pod "187134d2-2fe9-4beb-beff-6a48162a1933" (UID: "187134d2-2fe9-4beb-beff-6a48162a1933"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.064786 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/187134d2-2fe9-4beb-beff-6a48162a1933-config-data" (OuterVolumeSpecName: "config-data") pod "187134d2-2fe9-4beb-beff-6a48162a1933" (UID: "187134d2-2fe9-4beb-beff-6a48162a1933"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.092049 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/187134d2-2fe9-4beb-beff-6a48162a1933-server-conf" (OuterVolumeSpecName: "server-conf") pod "187134d2-2fe9-4beb-beff-6a48162a1933" (UID: "187134d2-2fe9-4beb-beff-6a48162a1933"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.132341 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "187134d2-2fe9-4beb-beff-6a48162a1933" (UID: "187134d2-2fe9-4beb-beff-6a48162a1933"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.132634 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-confd\") pod \"187134d2-2fe9-4beb-beff-6a48162a1933\" (UID: \"187134d2-2fe9-4beb-beff-6a48162a1933\") " Oct 09 10:48:38 crc kubenswrapper[4740]: W1009 10:48:38.132806 4740 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/187134d2-2fe9-4beb-beff-6a48162a1933/volumes/kubernetes.io~projected/rabbitmq-confd Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.132833 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "187134d2-2fe9-4beb-beff-6a48162a1933" (UID: "187134d2-2fe9-4beb-beff-6a48162a1933"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.133953 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.133985 4740 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/187134d2-2fe9-4beb-beff-6a48162a1933-pod-info\") on node \"crc\" DevicePath \"\"" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.134022 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/187134d2-2fe9-4beb-beff-6a48162a1933-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.134035 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/187134d2-2fe9-4beb-beff-6a48162a1933-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.134048 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db275\" (UniqueName: \"kubernetes.io/projected/187134d2-2fe9-4beb-beff-6a48162a1933-kube-api-access-db275\") on node \"crc\" DevicePath \"\"" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.134105 4740 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.134121 4740 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/187134d2-2fe9-4beb-beff-6a48162a1933-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.134133 4740 
reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/187134d2-2fe9-4beb-beff-6a48162a1933-server-conf\") on node \"crc\" DevicePath \"\"" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.152663 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.159380 4740 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.236338 4740 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.652430 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ff4b6585-91c6-48f8-ba40-5cd075c7c59e","Type":"ContainerStarted","Data":"643eb13842148eb4120a9e76dd3fe0fd8b8cdef25948c7f8e31e9bc519ff04ef"} Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.654686 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"187134d2-2fe9-4beb-beff-6a48162a1933","Type":"ContainerDied","Data":"e7cf6430c9bd0d2c75ac02f7ff81bee00e811aaba181b39cc97adae9e47a4677"} Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.654737 4740 scope.go:117] "RemoveContainer" containerID="94f0634b0ed255b557447062d5631f7ff62524a41768d6c0fafad907dce032a4" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.654809 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.676191 4740 scope.go:117] "RemoveContainer" containerID="bcd7f5081393f9b0fd83b07b79c9fd1569cb832e594a750001a732dca196c1c0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.694420 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.704841 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.721678 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 10:48:38 crc kubenswrapper[4740]: E1009 10:48:38.722044 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="187134d2-2fe9-4beb-beff-6a48162a1933" containerName="rabbitmq" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.722061 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="187134d2-2fe9-4beb-beff-6a48162a1933" containerName="rabbitmq" Oct 09 10:48:38 crc kubenswrapper[4740]: E1009 10:48:38.722103 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="187134d2-2fe9-4beb-beff-6a48162a1933" containerName="setup-container" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.722109 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="187134d2-2fe9-4beb-beff-6a48162a1933" containerName="setup-container" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.722265 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="187134d2-2fe9-4beb-beff-6a48162a1933" containerName="rabbitmq" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.723641 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.726034 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.726069 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.726119 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.726074 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.726254 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.726295 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-t9rb9" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.726331 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.739476 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.748340 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.748422 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.748523 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp24c\" (UniqueName: \"kubernetes.io/projected/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-kube-api-access-kp24c\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.748627 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.748797 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.748834 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.748861 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.748877 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.748936 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.748979 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.749071 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.850580 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp24c\" (UniqueName: 
\"kubernetes.io/projected/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-kube-api-access-kp24c\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.850663 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.850771 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.850801 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.850827 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.850849 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.850884 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.850910 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.850957 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.850990 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.851028 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 
crc kubenswrapper[4740]: I1009 10:48:38.852674 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.855098 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.855566 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.855781 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.855785 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.855888 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.856960 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.858711 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.858893 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.875076 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.875088 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp24c\" (UniqueName: \"kubernetes.io/projected/8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46-kube-api-access-kp24c\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:38 crc kubenswrapper[4740]: I1009 10:48:38.885419 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:39 crc kubenswrapper[4740]: I1009 10:48:39.054375 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:48:39 crc kubenswrapper[4740]: I1009 10:48:39.503204 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 10:48:39 crc kubenswrapper[4740]: W1009 10:48:39.537096 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ef6dccc_d229_4bb8_8fb2_5c6f859ecb46.slice/crio-aa682dc8b0941af2e3e76f709f0d07623fefd8de42914621a719aa64941ebafd WatchSource:0}: Error finding container aa682dc8b0941af2e3e76f709f0d07623fefd8de42914621a719aa64941ebafd: Status 404 returned error can't find the container with id aa682dc8b0941af2e3e76f709f0d07623fefd8de42914621a719aa64941ebafd Oct 09 10:48:39 crc kubenswrapper[4740]: I1009 10:48:39.667377 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46","Type":"ContainerStarted","Data":"aa682dc8b0941af2e3e76f709f0d07623fefd8de42914621a719aa64941ebafd"} Oct 09 10:48:39 crc kubenswrapper[4740]: I1009 10:48:39.771580 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="187134d2-2fe9-4beb-beff-6a48162a1933" path="/var/lib/kubelet/pods/187134d2-2fe9-4beb-beff-6a48162a1933/volumes" Oct 09 10:48:39 crc kubenswrapper[4740]: I1009 10:48:39.995658 4740 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-vlbfd"] Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.001038 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.003466 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-vlbfd"] Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.012206 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.184317 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jksfx\" (UniqueName: \"kubernetes.io/projected/43075b9d-c59f-4d19-adcc-662f573c1e48-kube-api-access-jksfx\") pod \"dnsmasq-dns-67b789f86c-vlbfd\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.184396 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-config\") pod \"dnsmasq-dns-67b789f86c-vlbfd\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.185035 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-vlbfd\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.185085 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-vlbfd\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.185268 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-vlbfd\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.185425 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-dns-svc\") pod \"dnsmasq-dns-67b789f86c-vlbfd\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.185487 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-vlbfd\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.287722 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-vlbfd\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.287948 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-vlbfd\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.288106 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-vlbfd\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.288221 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-dns-svc\") pod \"dnsmasq-dns-67b789f86c-vlbfd\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.288293 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-vlbfd\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.288397 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jksfx\" (UniqueName: \"kubernetes.io/projected/43075b9d-c59f-4d19-adcc-662f573c1e48-kube-api-access-jksfx\") pod \"dnsmasq-dns-67b789f86c-vlbfd\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.288531 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-config\") pod \"dnsmasq-dns-67b789f86c-vlbfd\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.289131 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-dns-svc\") pod \"dnsmasq-dns-67b789f86c-vlbfd\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.289177 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-vlbfd\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.289408 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-config\") pod \"dnsmasq-dns-67b789f86c-vlbfd\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.289436 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-vlbfd\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.290066 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-vlbfd\" 
(UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.290188 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-vlbfd\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.306647 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jksfx\" (UniqueName: \"kubernetes.io/projected/43075b9d-c59f-4d19-adcc-662f573c1e48-kube-api-access-jksfx\") pod \"dnsmasq-dns-67b789f86c-vlbfd\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.319691 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.681391 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ff4b6585-91c6-48f8-ba40-5cd075c7c59e","Type":"ContainerStarted","Data":"a90317d2448e23e76ce945d9e6c1b3eaed7f84716f816f497c7cadd999806b65"} Oct 09 10:48:40 crc kubenswrapper[4740]: I1009 10:48:40.768044 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-vlbfd"] Oct 09 10:48:40 crc kubenswrapper[4740]: W1009 10:48:40.933037 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43075b9d_c59f_4d19_adcc_662f573c1e48.slice/crio-07bdae9f2c28bbe5b25396e9926760cadea841a84424b839a800ebbb4325539f WatchSource:0}: Error finding container 07bdae9f2c28bbe5b25396e9926760cadea841a84424b839a800ebbb4325539f: Status 404 returned error can't find the 
container with id 07bdae9f2c28bbe5b25396e9926760cadea841a84424b839a800ebbb4325539f Oct 09 10:48:41 crc kubenswrapper[4740]: I1009 10:48:41.691845 4740 generic.go:334] "Generic (PLEG): container finished" podID="43075b9d-c59f-4d19-adcc-662f573c1e48" containerID="3761178146b4e22c88b4238e14ad06bc3a1085f0177255ce3bf3b076fd79130c" exitCode=0 Oct 09 10:48:41 crc kubenswrapper[4740]: I1009 10:48:41.691944 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" event={"ID":"43075b9d-c59f-4d19-adcc-662f573c1e48","Type":"ContainerDied","Data":"3761178146b4e22c88b4238e14ad06bc3a1085f0177255ce3bf3b076fd79130c"} Oct 09 10:48:41 crc kubenswrapper[4740]: I1009 10:48:41.692397 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" event={"ID":"43075b9d-c59f-4d19-adcc-662f573c1e48","Type":"ContainerStarted","Data":"07bdae9f2c28bbe5b25396e9926760cadea841a84424b839a800ebbb4325539f"} Oct 09 10:48:41 crc kubenswrapper[4740]: I1009 10:48:41.694934 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46","Type":"ContainerStarted","Data":"e58413c0f35ce6ce8a8c17ad231643075b44a8e6d7f34d660a341cb8dcac4ca3"} Oct 09 10:48:42 crc kubenswrapper[4740]: I1009 10:48:42.707816 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" event={"ID":"43075b9d-c59f-4d19-adcc-662f573c1e48","Type":"ContainerStarted","Data":"ddeaad8ab5e7803c3ee5a7b42900c11b562048b21466b029b6443d479cc3bd5c"} Oct 09 10:48:42 crc kubenswrapper[4740]: I1009 10:48:42.738148 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" podStartSLOduration=3.738124271 podStartE2EDuration="3.738124271s" podCreationTimestamp="2025-10-09 10:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-09 10:48:42.732179538 +0000 UTC m=+1261.694379949" watchObservedRunningTime="2025-10-09 10:48:42.738124271 +0000 UTC m=+1261.700324672" Oct 09 10:48:43 crc kubenswrapper[4740]: I1009 10:48:43.716352 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.322105 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.381155 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-6h6zr"] Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.381427 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" podUID="bdee6391-3978-4f05-b3c6-a80276b6295f" containerName="dnsmasq-dns" containerID="cri-o://6b41f3337bebd89d9b228c81cb1dfb6df151ad00f2a6e62c895aa024d1415b04" gracePeriod=10 Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.530944 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-sq7qc"] Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.533072 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.542500 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-sq7qc"] Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.698409 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a952fe70-b037-4995-a678-b3da7312dcee-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-sq7qc\" (UID: \"a952fe70-b037-4995-a678-b3da7312dcee\") " pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.698463 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a952fe70-b037-4995-a678-b3da7312dcee-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-sq7qc\" (UID: \"a952fe70-b037-4995-a678-b3da7312dcee\") " pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.698602 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a952fe70-b037-4995-a678-b3da7312dcee-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-sq7qc\" (UID: \"a952fe70-b037-4995-a678-b3da7312dcee\") " pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.698692 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a952fe70-b037-4995-a678-b3da7312dcee-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-sq7qc\" (UID: \"a952fe70-b037-4995-a678-b3da7312dcee\") " pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.698832 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a952fe70-b037-4995-a678-b3da7312dcee-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-sq7qc\" (UID: \"a952fe70-b037-4995-a678-b3da7312dcee\") " pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.698970 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a952fe70-b037-4995-a678-b3da7312dcee-config\") pod \"dnsmasq-dns-cb6ffcf87-sq7qc\" (UID: \"a952fe70-b037-4995-a678-b3da7312dcee\") " pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.699036 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tzk7\" (UniqueName: \"kubernetes.io/projected/a952fe70-b037-4995-a678-b3da7312dcee-kube-api-access-2tzk7\") pod \"dnsmasq-dns-cb6ffcf87-sq7qc\" (UID: \"a952fe70-b037-4995-a678-b3da7312dcee\") " pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.800190 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a952fe70-b037-4995-a678-b3da7312dcee-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-sq7qc\" (UID: \"a952fe70-b037-4995-a678-b3da7312dcee\") " pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.800260 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a952fe70-b037-4995-a678-b3da7312dcee-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-sq7qc\" (UID: \"a952fe70-b037-4995-a678-b3da7312dcee\") " pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.800319 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a952fe70-b037-4995-a678-b3da7312dcee-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-sq7qc\" (UID: \"a952fe70-b037-4995-a678-b3da7312dcee\") " pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.800386 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a952fe70-b037-4995-a678-b3da7312dcee-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-sq7qc\" (UID: \"a952fe70-b037-4995-a678-b3da7312dcee\") " pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.800447 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a952fe70-b037-4995-a678-b3da7312dcee-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-sq7qc\" (UID: \"a952fe70-b037-4995-a678-b3da7312dcee\") " pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.800500 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a952fe70-b037-4995-a678-b3da7312dcee-config\") pod \"dnsmasq-dns-cb6ffcf87-sq7qc\" (UID: \"a952fe70-b037-4995-a678-b3da7312dcee\") " pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.800530 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tzk7\" (UniqueName: \"kubernetes.io/projected/a952fe70-b037-4995-a678-b3da7312dcee-kube-api-access-2tzk7\") pod \"dnsmasq-dns-cb6ffcf87-sq7qc\" (UID: \"a952fe70-b037-4995-a678-b3da7312dcee\") " pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.801560 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/a952fe70-b037-4995-a678-b3da7312dcee-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-sq7qc\" (UID: \"a952fe70-b037-4995-a678-b3da7312dcee\") " pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.801560 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a952fe70-b037-4995-a678-b3da7312dcee-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-sq7qc\" (UID: \"a952fe70-b037-4995-a678-b3da7312dcee\") " pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.801981 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a952fe70-b037-4995-a678-b3da7312dcee-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-sq7qc\" (UID: \"a952fe70-b037-4995-a678-b3da7312dcee\") " pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.802438 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a952fe70-b037-4995-a678-b3da7312dcee-config\") pod \"dnsmasq-dns-cb6ffcf87-sq7qc\" (UID: \"a952fe70-b037-4995-a678-b3da7312dcee\") " pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.802481 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a952fe70-b037-4995-a678-b3da7312dcee-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-sq7qc\" (UID: \"a952fe70-b037-4995-a678-b3da7312dcee\") " pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.802802 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a952fe70-b037-4995-a678-b3da7312dcee-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-sq7qc\" 
(UID: \"a952fe70-b037-4995-a678-b3da7312dcee\") " pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.804974 4740 generic.go:334] "Generic (PLEG): container finished" podID="bdee6391-3978-4f05-b3c6-a80276b6295f" containerID="6b41f3337bebd89d9b228c81cb1dfb6df151ad00f2a6e62c895aa024d1415b04" exitCode=0 Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.805007 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" event={"ID":"bdee6391-3978-4f05-b3c6-a80276b6295f","Type":"ContainerDied","Data":"6b41f3337bebd89d9b228c81cb1dfb6df151ad00f2a6e62c895aa024d1415b04"} Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.805032 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" event={"ID":"bdee6391-3978-4f05-b3c6-a80276b6295f","Type":"ContainerDied","Data":"014496efd3dba5e8e08537154d2b236b8419532dd5e728c517c1b695b0572e3e"} Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.805042 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="014496efd3dba5e8e08537154d2b236b8419532dd5e728c517c1b695b0572e3e" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.823497 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tzk7\" (UniqueName: \"kubernetes.io/projected/a952fe70-b037-4995-a678-b3da7312dcee-kube-api-access-2tzk7\") pod \"dnsmasq-dns-cb6ffcf87-sq7qc\" (UID: \"a952fe70-b037-4995-a678-b3da7312dcee\") " pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.862893 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:50 crc kubenswrapper[4740]: I1009 10:48:50.883476 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.007611 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-config\") pod \"bdee6391-3978-4f05-b3c6-a80276b6295f\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.007698 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-dns-svc\") pod \"bdee6391-3978-4f05-b3c6-a80276b6295f\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.007833 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmbvq\" (UniqueName: \"kubernetes.io/projected/bdee6391-3978-4f05-b3c6-a80276b6295f-kube-api-access-cmbvq\") pod \"bdee6391-3978-4f05-b3c6-a80276b6295f\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.007886 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-ovsdbserver-nb\") pod \"bdee6391-3978-4f05-b3c6-a80276b6295f\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.007918 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-ovsdbserver-sb\") pod \"bdee6391-3978-4f05-b3c6-a80276b6295f\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.007946 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-dns-swift-storage-0\") pod \"bdee6391-3978-4f05-b3c6-a80276b6295f\" (UID: \"bdee6391-3978-4f05-b3c6-a80276b6295f\") " Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.013523 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdee6391-3978-4f05-b3c6-a80276b6295f-kube-api-access-cmbvq" (OuterVolumeSpecName: "kube-api-access-cmbvq") pod "bdee6391-3978-4f05-b3c6-a80276b6295f" (UID: "bdee6391-3978-4f05-b3c6-a80276b6295f"). InnerVolumeSpecName "kube-api-access-cmbvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.094413 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bdee6391-3978-4f05-b3c6-a80276b6295f" (UID: "bdee6391-3978-4f05-b3c6-a80276b6295f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.094446 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bdee6391-3978-4f05-b3c6-a80276b6295f" (UID: "bdee6391-3978-4f05-b3c6-a80276b6295f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.102645 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-config" (OuterVolumeSpecName: "config") pod "bdee6391-3978-4f05-b3c6-a80276b6295f" (UID: "bdee6391-3978-4f05-b3c6-a80276b6295f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.110106 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.110135 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmbvq\" (UniqueName: \"kubernetes.io/projected/bdee6391-3978-4f05-b3c6-a80276b6295f-kube-api-access-cmbvq\") on node \"crc\" DevicePath \"\"" Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.110148 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.110162 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.114413 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bdee6391-3978-4f05-b3c6-a80276b6295f" (UID: "bdee6391-3978-4f05-b3c6-a80276b6295f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.128365 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bdee6391-3978-4f05-b3c6-a80276b6295f" (UID: "bdee6391-3978-4f05-b3c6-a80276b6295f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.212824 4740 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.212858 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdee6391-3978-4f05-b3c6-a80276b6295f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.339595 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-sq7qc"] Oct 09 10:48:51 crc kubenswrapper[4740]: W1009 10:48:51.349870 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda952fe70_b037_4995_a678_b3da7312dcee.slice/crio-8f2c71a226269b7b4657d413e50457c7666d5eb7195a48f7007e6fbe5392e4ba WatchSource:0}: Error finding container 8f2c71a226269b7b4657d413e50457c7666d5eb7195a48f7007e6fbe5392e4ba: Status 404 returned error can't find the container with id 8f2c71a226269b7b4657d413e50457c7666d5eb7195a48f7007e6fbe5392e4ba Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.821290 4740 generic.go:334] "Generic (PLEG): container finished" podID="a952fe70-b037-4995-a678-b3da7312dcee" containerID="c623e8f86e13440e394e9d3eb5a4b045ada2fe997f65875bc62544eee59948c6" exitCode=0 Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.821377 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" event={"ID":"a952fe70-b037-4995-a678-b3da7312dcee","Type":"ContainerDied","Data":"c623e8f86e13440e394e9d3eb5a4b045ada2fe997f65875bc62544eee59948c6"} Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.821667 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-6h6zr" Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.821676 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" event={"ID":"a952fe70-b037-4995-a678-b3da7312dcee","Type":"ContainerStarted","Data":"8f2c71a226269b7b4657d413e50457c7666d5eb7195a48f7007e6fbe5392e4ba"} Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.876730 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-6h6zr"] Oct 09 10:48:51 crc kubenswrapper[4740]: I1009 10:48:51.885899 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-6h6zr"] Oct 09 10:48:52 crc kubenswrapper[4740]: I1009 10:48:52.836867 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" event={"ID":"a952fe70-b037-4995-a678-b3da7312dcee","Type":"ContainerStarted","Data":"33f6ac0583b3e584e485d0c67a201862d96a914e15f59c6a613f062cd1370f37"} Oct 09 10:48:52 crc kubenswrapper[4740]: I1009 10:48:52.839414 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:48:52 crc kubenswrapper[4740]: I1009 10:48:52.860689 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" podStartSLOduration=2.8606705999999997 podStartE2EDuration="2.8606706s" podCreationTimestamp="2025-10-09 10:48:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:48:52.859690733 +0000 UTC m=+1271.821891144" watchObservedRunningTime="2025-10-09 10:48:52.8606706 +0000 UTC m=+1271.822870981" Oct 09 10:48:53 crc kubenswrapper[4740]: I1009 10:48:53.771416 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdee6391-3978-4f05-b3c6-a80276b6295f" 
path="/var/lib/kubelet/pods/bdee6391-3978-4f05-b3c6-a80276b6295f/volumes" Oct 09 10:48:58 crc kubenswrapper[4740]: I1009 10:48:58.075556 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-66569d88ff-tjljh" podUID="501b9024-4f9f-41eb-ae73-d9ecb0637363" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 09 10:49:00 crc kubenswrapper[4740]: I1009 10:49:00.865029 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb6ffcf87-sq7qc" Oct 09 10:49:00 crc kubenswrapper[4740]: I1009 10:49:00.949930 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-vlbfd"] Oct 09 10:49:00 crc kubenswrapper[4740]: I1009 10:49:00.950192 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" podUID="43075b9d-c59f-4d19-adcc-662f573c1e48" containerName="dnsmasq-dns" containerID="cri-o://ddeaad8ab5e7803c3ee5a7b42900c11b562048b21466b029b6443d479cc3bd5c" gracePeriod=10 Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.553928 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.639579 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-ovsdbserver-sb\") pod \"43075b9d-c59f-4d19-adcc-662f573c1e48\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.639679 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-dns-svc\") pod \"43075b9d-c59f-4d19-adcc-662f573c1e48\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.639801 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jksfx\" (UniqueName: \"kubernetes.io/projected/43075b9d-c59f-4d19-adcc-662f573c1e48-kube-api-access-jksfx\") pod \"43075b9d-c59f-4d19-adcc-662f573c1e48\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.639828 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-config\") pod \"43075b9d-c59f-4d19-adcc-662f573c1e48\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.639846 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-dns-swift-storage-0\") pod \"43075b9d-c59f-4d19-adcc-662f573c1e48\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.639879 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-ovsdbserver-nb\") pod \"43075b9d-c59f-4d19-adcc-662f573c1e48\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.639943 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-openstack-edpm-ipam\") pod \"43075b9d-c59f-4d19-adcc-662f573c1e48\" (UID: \"43075b9d-c59f-4d19-adcc-662f573c1e48\") " Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.661874 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43075b9d-c59f-4d19-adcc-662f573c1e48-kube-api-access-jksfx" (OuterVolumeSpecName: "kube-api-access-jksfx") pod "43075b9d-c59f-4d19-adcc-662f573c1e48" (UID: "43075b9d-c59f-4d19-adcc-662f573c1e48"). InnerVolumeSpecName "kube-api-access-jksfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.695178 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "43075b9d-c59f-4d19-adcc-662f573c1e48" (UID: "43075b9d-c59f-4d19-adcc-662f573c1e48"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.695270 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "43075b9d-c59f-4d19-adcc-662f573c1e48" (UID: "43075b9d-c59f-4d19-adcc-662f573c1e48"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.696507 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-config" (OuterVolumeSpecName: "config") pod "43075b9d-c59f-4d19-adcc-662f573c1e48" (UID: "43075b9d-c59f-4d19-adcc-662f573c1e48"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.697457 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "43075b9d-c59f-4d19-adcc-662f573c1e48" (UID: "43075b9d-c59f-4d19-adcc-662f573c1e48"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.700283 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "43075b9d-c59f-4d19-adcc-662f573c1e48" (UID: "43075b9d-c59f-4d19-adcc-662f573c1e48"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.710349 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "43075b9d-c59f-4d19-adcc-662f573c1e48" (UID: "43075b9d-c59f-4d19-adcc-662f573c1e48"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.742863 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.743108 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.743167 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jksfx\" (UniqueName: \"kubernetes.io/projected/43075b9d-c59f-4d19-adcc-662f573c1e48-kube-api-access-jksfx\") on node \"crc\" DevicePath \"\"" Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.743225 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-config\") on node \"crc\" DevicePath \"\"" Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.743274 4740 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.743322 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.743373 4740 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/43075b9d-c59f-4d19-adcc-662f573c1e48-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.981403 
4740 generic.go:334] "Generic (PLEG): container finished" podID="43075b9d-c59f-4d19-adcc-662f573c1e48" containerID="ddeaad8ab5e7803c3ee5a7b42900c11b562048b21466b029b6443d479cc3bd5c" exitCode=0 Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.981479 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.981486 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" event={"ID":"43075b9d-c59f-4d19-adcc-662f573c1e48","Type":"ContainerDied","Data":"ddeaad8ab5e7803c3ee5a7b42900c11b562048b21466b029b6443d479cc3bd5c"} Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.981560 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-vlbfd" event={"ID":"43075b9d-c59f-4d19-adcc-662f573c1e48","Type":"ContainerDied","Data":"07bdae9f2c28bbe5b25396e9926760cadea841a84424b839a800ebbb4325539f"} Oct 09 10:49:01 crc kubenswrapper[4740]: I1009 10:49:01.981590 4740 scope.go:117] "RemoveContainer" containerID="ddeaad8ab5e7803c3ee5a7b42900c11b562048b21466b029b6443d479cc3bd5c" Oct 09 10:49:02 crc kubenswrapper[4740]: I1009 10:49:02.006370 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-vlbfd"] Oct 09 10:49:02 crc kubenswrapper[4740]: I1009 10:49:02.007167 4740 scope.go:117] "RemoveContainer" containerID="3761178146b4e22c88b4238e14ad06bc3a1085f0177255ce3bf3b076fd79130c" Oct 09 10:49:02 crc kubenswrapper[4740]: I1009 10:49:02.014453 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-vlbfd"] Oct 09 10:49:02 crc kubenswrapper[4740]: I1009 10:49:02.037021 4740 scope.go:117] "RemoveContainer" containerID="ddeaad8ab5e7803c3ee5a7b42900c11b562048b21466b029b6443d479cc3bd5c" Oct 09 10:49:02 crc kubenswrapper[4740]: E1009 10:49:02.037483 4740 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"ddeaad8ab5e7803c3ee5a7b42900c11b562048b21466b029b6443d479cc3bd5c\": container with ID starting with ddeaad8ab5e7803c3ee5a7b42900c11b562048b21466b029b6443d479cc3bd5c not found: ID does not exist" containerID="ddeaad8ab5e7803c3ee5a7b42900c11b562048b21466b029b6443d479cc3bd5c" Oct 09 10:49:02 crc kubenswrapper[4740]: I1009 10:49:02.037542 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddeaad8ab5e7803c3ee5a7b42900c11b562048b21466b029b6443d479cc3bd5c"} err="failed to get container status \"ddeaad8ab5e7803c3ee5a7b42900c11b562048b21466b029b6443d479cc3bd5c\": rpc error: code = NotFound desc = could not find container \"ddeaad8ab5e7803c3ee5a7b42900c11b562048b21466b029b6443d479cc3bd5c\": container with ID starting with ddeaad8ab5e7803c3ee5a7b42900c11b562048b21466b029b6443d479cc3bd5c not found: ID does not exist" Oct 09 10:49:02 crc kubenswrapper[4740]: I1009 10:49:02.037602 4740 scope.go:117] "RemoveContainer" containerID="3761178146b4e22c88b4238e14ad06bc3a1085f0177255ce3bf3b076fd79130c" Oct 09 10:49:02 crc kubenswrapper[4740]: E1009 10:49:02.038223 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3761178146b4e22c88b4238e14ad06bc3a1085f0177255ce3bf3b076fd79130c\": container with ID starting with 3761178146b4e22c88b4238e14ad06bc3a1085f0177255ce3bf3b076fd79130c not found: ID does not exist" containerID="3761178146b4e22c88b4238e14ad06bc3a1085f0177255ce3bf3b076fd79130c" Oct 09 10:49:02 crc kubenswrapper[4740]: I1009 10:49:02.038253 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3761178146b4e22c88b4238e14ad06bc3a1085f0177255ce3bf3b076fd79130c"} err="failed to get container status \"3761178146b4e22c88b4238e14ad06bc3a1085f0177255ce3bf3b076fd79130c\": rpc error: code = NotFound desc = could not find container 
\"3761178146b4e22c88b4238e14ad06bc3a1085f0177255ce3bf3b076fd79130c\": container with ID starting with 3761178146b4e22c88b4238e14ad06bc3a1085f0177255ce3bf3b076fd79130c not found: ID does not exist" Oct 09 10:49:03 crc kubenswrapper[4740]: I1009 10:49:03.766725 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43075b9d-c59f-4d19-adcc-662f573c1e48" path="/var/lib/kubelet/pods/43075b9d-c59f-4d19-adcc-662f573c1e48/volumes" Oct 09 10:49:05 crc kubenswrapper[4740]: I1009 10:49:05.407459 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 10:49:05 crc kubenswrapper[4740]: I1009 10:49:05.407520 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 10:49:13 crc kubenswrapper[4740]: I1009 10:49:13.100717 4740 generic.go:334] "Generic (PLEG): container finished" podID="ff4b6585-91c6-48f8-ba40-5cd075c7c59e" containerID="a90317d2448e23e76ce945d9e6c1b3eaed7f84716f816f497c7cadd999806b65" exitCode=0 Oct 09 10:49:13 crc kubenswrapper[4740]: I1009 10:49:13.100822 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ff4b6585-91c6-48f8-ba40-5cd075c7c59e","Type":"ContainerDied","Data":"a90317d2448e23e76ce945d9e6c1b3eaed7f84716f816f497c7cadd999806b65"} Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.111300 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"ff4b6585-91c6-48f8-ba40-5cd075c7c59e","Type":"ContainerStarted","Data":"8adb99ecffcae3c4b16caba897d10e0d6ebf33878c4dd3029668416c8a843436"} Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.111925 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.112900 4740 generic.go:334] "Generic (PLEG): container finished" podID="8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46" containerID="e58413c0f35ce6ce8a8c17ad231643075b44a8e6d7f34d660a341cb8dcac4ca3" exitCode=0 Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.112948 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46","Type":"ContainerDied","Data":"e58413c0f35ce6ce8a8c17ad231643075b44a8e6d7f34d660a341cb8dcac4ca3"} Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.188745 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.188718165 podStartE2EDuration="38.188718165s" podCreationTimestamp="2025-10-09 10:48:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:49:14.154403247 +0000 UTC m=+1293.116603628" watchObservedRunningTime="2025-10-09 10:49:14.188718165 +0000 UTC m=+1293.150918566" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.469590 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s"] Oct 09 10:49:14 crc kubenswrapper[4740]: E1009 10:49:14.469962 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43075b9d-c59f-4d19-adcc-662f573c1e48" containerName="init" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.469981 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="43075b9d-c59f-4d19-adcc-662f573c1e48" 
containerName="init" Oct 09 10:49:14 crc kubenswrapper[4740]: E1009 10:49:14.470004 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdee6391-3978-4f05-b3c6-a80276b6295f" containerName="dnsmasq-dns" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.470010 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdee6391-3978-4f05-b3c6-a80276b6295f" containerName="dnsmasq-dns" Oct 09 10:49:14 crc kubenswrapper[4740]: E1009 10:49:14.470024 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdee6391-3978-4f05-b3c6-a80276b6295f" containerName="init" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.470030 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdee6391-3978-4f05-b3c6-a80276b6295f" containerName="init" Oct 09 10:49:14 crc kubenswrapper[4740]: E1009 10:49:14.470048 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43075b9d-c59f-4d19-adcc-662f573c1e48" containerName="dnsmasq-dns" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.470054 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="43075b9d-c59f-4d19-adcc-662f573c1e48" containerName="dnsmasq-dns" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.470238 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="43075b9d-c59f-4d19-adcc-662f573c1e48" containerName="dnsmasq-dns" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.470248 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdee6391-3978-4f05-b3c6-a80276b6295f" containerName="dnsmasq-dns" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.470858 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.481882 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.482097 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hslsm" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.482242 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.482409 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.505211 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s"] Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.614668 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/945008af-c262-4581-8f40-51b8fe5a9dd8-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s\" (UID: \"945008af-c262-4581-8f40-51b8fe5a9dd8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.615522 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945008af-c262-4581-8f40-51b8fe5a9dd8-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s\" (UID: \"945008af-c262-4581-8f40-51b8fe5a9dd8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.615706 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/945008af-c262-4581-8f40-51b8fe5a9dd8-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s\" (UID: \"945008af-c262-4581-8f40-51b8fe5a9dd8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.615882 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p7qw\" (UniqueName: \"kubernetes.io/projected/945008af-c262-4581-8f40-51b8fe5a9dd8-kube-api-access-7p7qw\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s\" (UID: \"945008af-c262-4581-8f40-51b8fe5a9dd8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.718222 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945008af-c262-4581-8f40-51b8fe5a9dd8-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s\" (UID: \"945008af-c262-4581-8f40-51b8fe5a9dd8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.718312 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/945008af-c262-4581-8f40-51b8fe5a9dd8-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s\" (UID: \"945008af-c262-4581-8f40-51b8fe5a9dd8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.718357 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p7qw\" (UniqueName: \"kubernetes.io/projected/945008af-c262-4581-8f40-51b8fe5a9dd8-kube-api-access-7p7qw\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s\" (UID: \"945008af-c262-4581-8f40-51b8fe5a9dd8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.718460 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/945008af-c262-4581-8f40-51b8fe5a9dd8-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s\" (UID: \"945008af-c262-4581-8f40-51b8fe5a9dd8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.723800 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/945008af-c262-4581-8f40-51b8fe5a9dd8-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s\" (UID: \"945008af-c262-4581-8f40-51b8fe5a9dd8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.724047 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/945008af-c262-4581-8f40-51b8fe5a9dd8-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s\" (UID: \"945008af-c262-4581-8f40-51b8fe5a9dd8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.725132 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945008af-c262-4581-8f40-51b8fe5a9dd8-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s\" (UID: \"945008af-c262-4581-8f40-51b8fe5a9dd8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.740237 4740 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-7p7qw\" (UniqueName: \"kubernetes.io/projected/945008af-c262-4581-8f40-51b8fe5a9dd8-kube-api-access-7p7qw\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s\" (UID: \"945008af-c262-4581-8f40-51b8fe5a9dd8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s" Oct 09 10:49:14 crc kubenswrapper[4740]: I1009 10:49:14.804090 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s" Oct 09 10:49:15 crc kubenswrapper[4740]: I1009 10:49:15.122877 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46","Type":"ContainerStarted","Data":"d406a155e5d8b983ecf390733420cb90a66a4f87cef4530fbea89606e6d69ba2"} Oct 09 10:49:15 crc kubenswrapper[4740]: I1009 10:49:15.157938 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.157892341 podStartE2EDuration="37.157892341s" podCreationTimestamp="2025-10-09 10:48:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 10:49:15.156815582 +0000 UTC m=+1294.119015993" watchObservedRunningTime="2025-10-09 10:49:15.157892341 +0000 UTC m=+1294.120092722" Oct 09 10:49:15 crc kubenswrapper[4740]: I1009 10:49:15.295513 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s"] Oct 09 10:49:15 crc kubenswrapper[4740]: I1009 10:49:15.304266 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 10:49:16 crc kubenswrapper[4740]: I1009 10:49:16.133540 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s" 
event={"ID":"945008af-c262-4581-8f40-51b8fe5a9dd8","Type":"ContainerStarted","Data":"c9887b3cbed40b35d6d4d9060d58b357f5309464cea86f4fee884c856ebeb8ec"} Oct 09 10:49:19 crc kubenswrapper[4740]: I1009 10:49:19.054457 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:49:24 crc kubenswrapper[4740]: I1009 10:49:24.222324 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s" event={"ID":"945008af-c262-4581-8f40-51b8fe5a9dd8","Type":"ContainerStarted","Data":"3bb79a824096b017be48adcd9f4b381f30a6f022330d2cdd43d413654e42fca1"} Oct 09 10:49:24 crc kubenswrapper[4740]: I1009 10:49:24.245633 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s" podStartSLOduration=2.126891278 podStartE2EDuration="10.245614397s" podCreationTimestamp="2025-10-09 10:49:14 +0000 UTC" firstStartedPulling="2025-10-09 10:49:15.304053308 +0000 UTC m=+1294.266253689" lastFinishedPulling="2025-10-09 10:49:23.422776417 +0000 UTC m=+1302.384976808" observedRunningTime="2025-10-09 10:49:24.239237665 +0000 UTC m=+1303.201438046" watchObservedRunningTime="2025-10-09 10:49:24.245614397 +0000 UTC m=+1303.207814768" Oct 09 10:49:27 crc kubenswrapper[4740]: I1009 10:49:27.630902 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 09 10:49:29 crc kubenswrapper[4740]: I1009 10:49:29.057925 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 09 10:49:35 crc kubenswrapper[4740]: I1009 10:49:35.340886 4740 generic.go:334] "Generic (PLEG): container finished" podID="945008af-c262-4581-8f40-51b8fe5a9dd8" containerID="3bb79a824096b017be48adcd9f4b381f30a6f022330d2cdd43d413654e42fca1" exitCode=0 Oct 09 10:49:35 crc kubenswrapper[4740]: I1009 10:49:35.340962 4740 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s" event={"ID":"945008af-c262-4581-8f40-51b8fe5a9dd8","Type":"ContainerDied","Data":"3bb79a824096b017be48adcd9f4b381f30a6f022330d2cdd43d413654e42fca1"} Oct 09 10:49:35 crc kubenswrapper[4740]: I1009 10:49:35.407609 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 10:49:35 crc kubenswrapper[4740]: I1009 10:49:35.407664 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 10:49:36 crc kubenswrapper[4740]: I1009 10:49:36.781431 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s" Oct 09 10:49:36 crc kubenswrapper[4740]: I1009 10:49:36.934354 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945008af-c262-4581-8f40-51b8fe5a9dd8-repo-setup-combined-ca-bundle\") pod \"945008af-c262-4581-8f40-51b8fe5a9dd8\" (UID: \"945008af-c262-4581-8f40-51b8fe5a9dd8\") " Oct 09 10:49:36 crc kubenswrapper[4740]: I1009 10:49:36.936214 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p7qw\" (UniqueName: \"kubernetes.io/projected/945008af-c262-4581-8f40-51b8fe5a9dd8-kube-api-access-7p7qw\") pod \"945008af-c262-4581-8f40-51b8fe5a9dd8\" (UID: \"945008af-c262-4581-8f40-51b8fe5a9dd8\") " Oct 09 10:49:36 crc kubenswrapper[4740]: I1009 10:49:36.936450 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/945008af-c262-4581-8f40-51b8fe5a9dd8-inventory\") pod \"945008af-c262-4581-8f40-51b8fe5a9dd8\" (UID: \"945008af-c262-4581-8f40-51b8fe5a9dd8\") " Oct 09 10:49:36 crc kubenswrapper[4740]: I1009 10:49:36.936555 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/945008af-c262-4581-8f40-51b8fe5a9dd8-ssh-key\") pod \"945008af-c262-4581-8f40-51b8fe5a9dd8\" (UID: \"945008af-c262-4581-8f40-51b8fe5a9dd8\") " Oct 09 10:49:36 crc kubenswrapper[4740]: I1009 10:49:36.946888 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/945008af-c262-4581-8f40-51b8fe5a9dd8-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "945008af-c262-4581-8f40-51b8fe5a9dd8" (UID: "945008af-c262-4581-8f40-51b8fe5a9dd8"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:49:36 crc kubenswrapper[4740]: I1009 10:49:36.946933 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/945008af-c262-4581-8f40-51b8fe5a9dd8-kube-api-access-7p7qw" (OuterVolumeSpecName: "kube-api-access-7p7qw") pod "945008af-c262-4581-8f40-51b8fe5a9dd8" (UID: "945008af-c262-4581-8f40-51b8fe5a9dd8"). InnerVolumeSpecName "kube-api-access-7p7qw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:49:36 crc kubenswrapper[4740]: I1009 10:49:36.966130 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/945008af-c262-4581-8f40-51b8fe5a9dd8-inventory" (OuterVolumeSpecName: "inventory") pod "945008af-c262-4581-8f40-51b8fe5a9dd8" (UID: "945008af-c262-4581-8f40-51b8fe5a9dd8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:49:36 crc kubenswrapper[4740]: I1009 10:49:36.967779 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/945008af-c262-4581-8f40-51b8fe5a9dd8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "945008af-c262-4581-8f40-51b8fe5a9dd8" (UID: "945008af-c262-4581-8f40-51b8fe5a9dd8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.039168 4740 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945008af-c262-4581-8f40-51b8fe5a9dd8-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.039211 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p7qw\" (UniqueName: \"kubernetes.io/projected/945008af-c262-4581-8f40-51b8fe5a9dd8-kube-api-access-7p7qw\") on node \"crc\" DevicePath \"\""
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.039224 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/945008af-c262-4581-8f40-51b8fe5a9dd8-inventory\") on node \"crc\" DevicePath \"\""
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.039234 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/945008af-c262-4581-8f40-51b8fe5a9dd8-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.368900 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s" event={"ID":"945008af-c262-4581-8f40-51b8fe5a9dd8","Type":"ContainerDied","Data":"c9887b3cbed40b35d6d4d9060d58b357f5309464cea86f4fee884c856ebeb8ec"}
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.368945 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9887b3cbed40b35d6d4d9060d58b357f5309464cea86f4fee884c856ebeb8ec"
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.369024 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s"
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.476225 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-4jjwr"]
Oct 09 10:49:37 crc kubenswrapper[4740]: E1009 10:49:37.476593 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="945008af-c262-4581-8f40-51b8fe5a9dd8" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.476611 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="945008af-c262-4581-8f40-51b8fe5a9dd8" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.476825 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="945008af-c262-4581-8f40-51b8fe5a9dd8" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.477587 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4jjwr"
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.479512 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hslsm"
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.480290 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.480368 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.481689 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.493506 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-4jjwr"]
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.548108 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bb88dfa-ffc6-433a-9df2-f00e2a6805e7-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4jjwr\" (UID: \"1bb88dfa-ffc6-433a-9df2-f00e2a6805e7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4jjwr"
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.548166 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bb88dfa-ffc6-433a-9df2-f00e2a6805e7-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4jjwr\" (UID: \"1bb88dfa-ffc6-433a-9df2-f00e2a6805e7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4jjwr"
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.548194 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l42l\" (UniqueName: \"kubernetes.io/projected/1bb88dfa-ffc6-433a-9df2-f00e2a6805e7-kube-api-access-5l42l\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4jjwr\" (UID: \"1bb88dfa-ffc6-433a-9df2-f00e2a6805e7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4jjwr"
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.650621 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bb88dfa-ffc6-433a-9df2-f00e2a6805e7-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4jjwr\" (UID: \"1bb88dfa-ffc6-433a-9df2-f00e2a6805e7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4jjwr"
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.650715 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bb88dfa-ffc6-433a-9df2-f00e2a6805e7-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4jjwr\" (UID: \"1bb88dfa-ffc6-433a-9df2-f00e2a6805e7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4jjwr"
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.650778 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l42l\" (UniqueName: \"kubernetes.io/projected/1bb88dfa-ffc6-433a-9df2-f00e2a6805e7-kube-api-access-5l42l\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4jjwr\" (UID: \"1bb88dfa-ffc6-433a-9df2-f00e2a6805e7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4jjwr"
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.655508 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bb88dfa-ffc6-433a-9df2-f00e2a6805e7-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4jjwr\" (UID: \"1bb88dfa-ffc6-433a-9df2-f00e2a6805e7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4jjwr"
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.655864 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bb88dfa-ffc6-433a-9df2-f00e2a6805e7-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4jjwr\" (UID: \"1bb88dfa-ffc6-433a-9df2-f00e2a6805e7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4jjwr"
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.672124 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l42l\" (UniqueName: \"kubernetes.io/projected/1bb88dfa-ffc6-433a-9df2-f00e2a6805e7-kube-api-access-5l42l\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4jjwr\" (UID: \"1bb88dfa-ffc6-433a-9df2-f00e2a6805e7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4jjwr"
Oct 09 10:49:37 crc kubenswrapper[4740]: I1009 10:49:37.797283 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4jjwr"
Oct 09 10:49:38 crc kubenswrapper[4740]: I1009 10:49:38.315394 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-4jjwr"]
Oct 09 10:49:38 crc kubenswrapper[4740]: W1009 10:49:38.323452 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bb88dfa_ffc6_433a_9df2_f00e2a6805e7.slice/crio-52fdba5520e0dc77a10c8181fcd92eb1cc1adf3bf9ffcee456907312185a0600 WatchSource:0}: Error finding container 52fdba5520e0dc77a10c8181fcd92eb1cc1adf3bf9ffcee456907312185a0600: Status 404 returned error can't find the container with id 52fdba5520e0dc77a10c8181fcd92eb1cc1adf3bf9ffcee456907312185a0600
Oct 09 10:49:38 crc kubenswrapper[4740]: I1009 10:49:38.379729 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4jjwr" event={"ID":"1bb88dfa-ffc6-433a-9df2-f00e2a6805e7","Type":"ContainerStarted","Data":"52fdba5520e0dc77a10c8181fcd92eb1cc1adf3bf9ffcee456907312185a0600"}
Oct 09 10:49:39 crc kubenswrapper[4740]: I1009 10:49:39.393078 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4jjwr" event={"ID":"1bb88dfa-ffc6-433a-9df2-f00e2a6805e7","Type":"ContainerStarted","Data":"df4df811b4677e7dd668650727e2c5a70fe71f778c7a7cdefb05a304375411f5"}
Oct 09 10:49:39 crc kubenswrapper[4740]: I1009 10:49:39.409806 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4jjwr" podStartSLOduration=1.738118662 podStartE2EDuration="2.409789148s" podCreationTimestamp="2025-10-09 10:49:37 +0000 UTC" firstStartedPulling="2025-10-09 10:49:38.327877883 +0000 UTC m=+1317.290078254" lastFinishedPulling="2025-10-09 10:49:38.999548349 +0000 UTC m=+1317.961748740" observedRunningTime="2025-10-09 10:49:39.407334912 +0000 UTC m=+1318.369535293" watchObservedRunningTime="2025-10-09 10:49:39.409789148 +0000 UTC m=+1318.371989519"
Oct 09 10:49:42 crc kubenswrapper[4740]: I1009 10:49:42.424190 4740 generic.go:334] "Generic (PLEG): container finished" podID="1bb88dfa-ffc6-433a-9df2-f00e2a6805e7" containerID="df4df811b4677e7dd668650727e2c5a70fe71f778c7a7cdefb05a304375411f5" exitCode=0
Oct 09 10:49:42 crc kubenswrapper[4740]: I1009 10:49:42.424250 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4jjwr" event={"ID":"1bb88dfa-ffc6-433a-9df2-f00e2a6805e7","Type":"ContainerDied","Data":"df4df811b4677e7dd668650727e2c5a70fe71f778c7a7cdefb05a304375411f5"}
Oct 09 10:49:43 crc kubenswrapper[4740]: I1009 10:49:43.882112 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4jjwr"
Oct 09 10:49:43 crc kubenswrapper[4740]: I1009 10:49:43.994506 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bb88dfa-ffc6-433a-9df2-f00e2a6805e7-ssh-key\") pod \"1bb88dfa-ffc6-433a-9df2-f00e2a6805e7\" (UID: \"1bb88dfa-ffc6-433a-9df2-f00e2a6805e7\") "
Oct 09 10:49:43 crc kubenswrapper[4740]: I1009 10:49:43.994826 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bb88dfa-ffc6-433a-9df2-f00e2a6805e7-inventory\") pod \"1bb88dfa-ffc6-433a-9df2-f00e2a6805e7\" (UID: \"1bb88dfa-ffc6-433a-9df2-f00e2a6805e7\") "
Oct 09 10:49:43 crc kubenswrapper[4740]: I1009 10:49:43.994990 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l42l\" (UniqueName: \"kubernetes.io/projected/1bb88dfa-ffc6-433a-9df2-f00e2a6805e7-kube-api-access-5l42l\") pod \"1bb88dfa-ffc6-433a-9df2-f00e2a6805e7\" (UID: \"1bb88dfa-ffc6-433a-9df2-f00e2a6805e7\") "
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.002058 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bb88dfa-ffc6-433a-9df2-f00e2a6805e7-kube-api-access-5l42l" (OuterVolumeSpecName: "kube-api-access-5l42l") pod "1bb88dfa-ffc6-433a-9df2-f00e2a6805e7" (UID: "1bb88dfa-ffc6-433a-9df2-f00e2a6805e7"). InnerVolumeSpecName "kube-api-access-5l42l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.020547 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb88dfa-ffc6-433a-9df2-f00e2a6805e7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1bb88dfa-ffc6-433a-9df2-f00e2a6805e7" (UID: "1bb88dfa-ffc6-433a-9df2-f00e2a6805e7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.040536 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb88dfa-ffc6-433a-9df2-f00e2a6805e7-inventory" (OuterVolumeSpecName: "inventory") pod "1bb88dfa-ffc6-433a-9df2-f00e2a6805e7" (UID: "1bb88dfa-ffc6-433a-9df2-f00e2a6805e7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.097080 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bb88dfa-ffc6-433a-9df2-f00e2a6805e7-inventory\") on node \"crc\" DevicePath \"\""
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.097112 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l42l\" (UniqueName: \"kubernetes.io/projected/1bb88dfa-ffc6-433a-9df2-f00e2a6805e7-kube-api-access-5l42l\") on node \"crc\" DevicePath \"\""
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.097125 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bb88dfa-ffc6-433a-9df2-f00e2a6805e7-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.453156 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4jjwr" event={"ID":"1bb88dfa-ffc6-433a-9df2-f00e2a6805e7","Type":"ContainerDied","Data":"52fdba5520e0dc77a10c8181fcd92eb1cc1adf3bf9ffcee456907312185a0600"}
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.453210 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52fdba5520e0dc77a10c8181fcd92eb1cc1adf3bf9ffcee456907312185a0600"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.453322 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4jjwr"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.579284 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8"]
Oct 09 10:49:44 crc kubenswrapper[4740]: E1009 10:49:44.579774 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bb88dfa-ffc6-433a-9df2-f00e2a6805e7" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.579798 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb88dfa-ffc6-433a-9df2-f00e2a6805e7" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.580050 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bb88dfa-ffc6-433a-9df2-f00e2a6805e7" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.580770 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.583236 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.583309 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.583606 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hslsm"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.584558 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.586783 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8"]
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.707396 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8\" (UID: \"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.707464 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8\" (UID: \"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.707655 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8\" (UID: \"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.707841 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnnkk\" (UniqueName: \"kubernetes.io/projected/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-kube-api-access-fnnkk\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8\" (UID: \"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.809630 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8\" (UID: \"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.809710 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8\" (UID: \"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.809781 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnnkk\" (UniqueName: \"kubernetes.io/projected/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-kube-api-access-fnnkk\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8\" (UID: \"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.809881 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8\" (UID: \"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.813281 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8\" (UID: \"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.815197 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8\" (UID: \"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.815494 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8\" (UID: \"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.827087 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnnkk\" (UniqueName: \"kubernetes.io/projected/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-kube-api-access-fnnkk\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8\" (UID: \"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8"
Oct 09 10:49:44 crc kubenswrapper[4740]: I1009 10:49:44.906236 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8"
Oct 09 10:49:45 crc kubenswrapper[4740]: I1009 10:49:45.482482 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8"]
Oct 09 10:49:46 crc kubenswrapper[4740]: I1009 10:49:46.475482 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8" event={"ID":"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8","Type":"ContainerStarted","Data":"0dc29f55d35cf6181b6d47400aabd409e5ae774686a78de844b2289800963038"}
Oct 09 10:49:46 crc kubenswrapper[4740]: I1009 10:49:46.475920 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8" event={"ID":"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8","Type":"ContainerStarted","Data":"b75ca866ae868ce172fc895b62f28fafbc230f9ed216c4720438a1712ab02beb"}
Oct 09 10:49:46 crc kubenswrapper[4740]: I1009 10:49:46.497242 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8" podStartSLOduration=2.0802319479999998 podStartE2EDuration="2.497223319s" podCreationTimestamp="2025-10-09 10:49:44 +0000 UTC" firstStartedPulling="2025-10-09 10:49:45.49121709 +0000 UTC m=+1324.453417481" lastFinishedPulling="2025-10-09 10:49:45.908208441 +0000 UTC m=+1324.870408852" observedRunningTime="2025-10-09 10:49:46.494469454 +0000 UTC m=+1325.456669835" watchObservedRunningTime="2025-10-09 10:49:46.497223319 +0000 UTC m=+1325.459423700"
Oct 09 10:50:05 crc kubenswrapper[4740]: I1009 10:50:05.407642 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 09 10:50:05 crc kubenswrapper[4740]: I1009 10:50:05.409900 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 09 10:50:05 crc kubenswrapper[4740]: I1009 10:50:05.410109 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kdjch"
Oct 09 10:50:05 crc kubenswrapper[4740]: I1009 10:50:05.412323 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"198944336c8d712e7c21457778dd2b3f352b6a523b8c1f1ec0b48f4c6d926ff3"} pod="openshift-machine-config-operator/machine-config-daemon-kdjch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 09 10:50:05 crc kubenswrapper[4740]: I1009 10:50:05.412398 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" containerID="cri-o://198944336c8d712e7c21457778dd2b3f352b6a523b8c1f1ec0b48f4c6d926ff3" gracePeriod=600
Oct 09 10:50:05 crc kubenswrapper[4740]: I1009 10:50:05.662600 4740 generic.go:334] "Generic (PLEG): container finished" podID="223b849a-db98-4f56-a649-9e144189950a" containerID="198944336c8d712e7c21457778dd2b3f352b6a523b8c1f1ec0b48f4c6d926ff3" exitCode=0
Oct 09 10:50:05 crc kubenswrapper[4740]: I1009 10:50:05.662618 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerDied","Data":"198944336c8d712e7c21457778dd2b3f352b6a523b8c1f1ec0b48f4c6d926ff3"}
Oct 09 10:50:05 crc kubenswrapper[4740]: I1009 10:50:05.663029 4740 scope.go:117] "RemoveContainer" containerID="fbbd1d786738a0dbe0197a069ad3e53334cad14f3901ee957620b2bd7f765083"
Oct 09 10:50:06 crc kubenswrapper[4740]: I1009 10:50:06.676582 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerStarted","Data":"b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562"}
Oct 09 10:50:42 crc kubenswrapper[4740]: I1009 10:50:42.711122 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6tbdk"]
Oct 09 10:50:42 crc kubenswrapper[4740]: I1009 10:50:42.715717 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6tbdk"
Oct 09 10:50:42 crc kubenswrapper[4740]: I1009 10:50:42.728673 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6tbdk"]
Oct 09 10:50:42 crc kubenswrapper[4740]: I1009 10:50:42.741544 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpp2s\" (UniqueName: \"kubernetes.io/projected/0730fcb7-39c4-40f2-b202-f92eb0de3091-kube-api-access-bpp2s\") pod \"redhat-operators-6tbdk\" (UID: \"0730fcb7-39c4-40f2-b202-f92eb0de3091\") " pod="openshift-marketplace/redhat-operators-6tbdk"
Oct 09 10:50:42 crc kubenswrapper[4740]: I1009 10:50:42.741837 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0730fcb7-39c4-40f2-b202-f92eb0de3091-catalog-content\") pod \"redhat-operators-6tbdk\" (UID: \"0730fcb7-39c4-40f2-b202-f92eb0de3091\") " pod="openshift-marketplace/redhat-operators-6tbdk"
Oct 09 10:50:42 crc kubenswrapper[4740]: I1009 10:50:42.741958 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0730fcb7-39c4-40f2-b202-f92eb0de3091-utilities\") pod \"redhat-operators-6tbdk\" (UID: \"0730fcb7-39c4-40f2-b202-f92eb0de3091\") " pod="openshift-marketplace/redhat-operators-6tbdk"
Oct 09 10:50:42 crc kubenswrapper[4740]: I1009 10:50:42.844251 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0730fcb7-39c4-40f2-b202-f92eb0de3091-catalog-content\") pod \"redhat-operators-6tbdk\" (UID: \"0730fcb7-39c4-40f2-b202-f92eb0de3091\") " pod="openshift-marketplace/redhat-operators-6tbdk"
Oct 09 10:50:42 crc kubenswrapper[4740]: I1009 10:50:42.844331 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0730fcb7-39c4-40f2-b202-f92eb0de3091-utilities\") pod \"redhat-operators-6tbdk\" (UID: \"0730fcb7-39c4-40f2-b202-f92eb0de3091\") " pod="openshift-marketplace/redhat-operators-6tbdk"
Oct 09 10:50:42 crc kubenswrapper[4740]: I1009 10:50:42.844383 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpp2s\" (UniqueName: \"kubernetes.io/projected/0730fcb7-39c4-40f2-b202-f92eb0de3091-kube-api-access-bpp2s\") pod \"redhat-operators-6tbdk\" (UID: \"0730fcb7-39c4-40f2-b202-f92eb0de3091\") " pod="openshift-marketplace/redhat-operators-6tbdk"
Oct 09 10:50:42 crc kubenswrapper[4740]: I1009 10:50:42.844877 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0730fcb7-39c4-40f2-b202-f92eb0de3091-catalog-content\") pod \"redhat-operators-6tbdk\" (UID: \"0730fcb7-39c4-40f2-b202-f92eb0de3091\") " pod="openshift-marketplace/redhat-operators-6tbdk"
Oct 09 10:50:42 crc kubenswrapper[4740]: I1009 10:50:42.844933 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0730fcb7-39c4-40f2-b202-f92eb0de3091-utilities\") pod \"redhat-operators-6tbdk\" (UID: \"0730fcb7-39c4-40f2-b202-f92eb0de3091\") " pod="openshift-marketplace/redhat-operators-6tbdk"
Oct 09 10:50:42 crc kubenswrapper[4740]: I1009 10:50:42.882137 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpp2s\" (UniqueName: \"kubernetes.io/projected/0730fcb7-39c4-40f2-b202-f92eb0de3091-kube-api-access-bpp2s\") pod \"redhat-operators-6tbdk\" (UID: \"0730fcb7-39c4-40f2-b202-f92eb0de3091\") " pod="openshift-marketplace/redhat-operators-6tbdk"
Oct 09 10:50:43 crc kubenswrapper[4740]: I1009 10:50:43.049067 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6tbdk"
Oct 09 10:50:43 crc kubenswrapper[4740]: I1009 10:50:43.511935 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6tbdk"]
Oct 09 10:50:43 crc kubenswrapper[4740]: W1009 10:50:43.519932 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0730fcb7_39c4_40f2_b202_f92eb0de3091.slice/crio-f2e31a0d32557ba3eee4120c80c06d90473bd09183a547a7a7a628ecbf6e15f7 WatchSource:0}: Error finding container f2e31a0d32557ba3eee4120c80c06d90473bd09183a547a7a7a628ecbf6e15f7: Status 404 returned error can't find the container with id f2e31a0d32557ba3eee4120c80c06d90473bd09183a547a7a7a628ecbf6e15f7
Oct 09 10:50:44 crc kubenswrapper[4740]: I1009 10:50:44.079842 4740 generic.go:334] "Generic (PLEG): container finished" podID="0730fcb7-39c4-40f2-b202-f92eb0de3091" containerID="d6636f2543b7ff4ff8c98793985b068f1f706a9e5edfc44776673b1e57fd1ce2" exitCode=0
Oct 09 10:50:44 crc kubenswrapper[4740]: I1009 10:50:44.079909 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6tbdk" event={"ID":"0730fcb7-39c4-40f2-b202-f92eb0de3091","Type":"ContainerDied","Data":"d6636f2543b7ff4ff8c98793985b068f1f706a9e5edfc44776673b1e57fd1ce2"}
Oct 09 10:50:44 crc kubenswrapper[4740]: I1009 10:50:44.079978 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6tbdk" event={"ID":"0730fcb7-39c4-40f2-b202-f92eb0de3091","Type":"ContainerStarted","Data":"f2e31a0d32557ba3eee4120c80c06d90473bd09183a547a7a7a628ecbf6e15f7"}
Oct 09 10:50:45 crc kubenswrapper[4740]: I1009 10:50:45.091739 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6tbdk" event={"ID":"0730fcb7-39c4-40f2-b202-f92eb0de3091","Type":"ContainerStarted","Data":"6d289f7d4a0228911b3c6ea5cd5fc292b2996c47235b6860ad1ef128a574a4f2"}
Oct 09 10:50:46 crc kubenswrapper[4740]: I1009 10:50:46.104726 4740 generic.go:334] "Generic (PLEG): container finished" podID="0730fcb7-39c4-40f2-b202-f92eb0de3091" containerID="6d289f7d4a0228911b3c6ea5cd5fc292b2996c47235b6860ad1ef128a574a4f2" exitCode=0
Oct 09 10:50:46 crc kubenswrapper[4740]: I1009 10:50:46.104872 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6tbdk" event={"ID":"0730fcb7-39c4-40f2-b202-f92eb0de3091","Type":"ContainerDied","Data":"6d289f7d4a0228911b3c6ea5cd5fc292b2996c47235b6860ad1ef128a574a4f2"}
Oct 09 10:50:47 crc kubenswrapper[4740]: I1009 10:50:47.120709 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6tbdk" event={"ID":"0730fcb7-39c4-40f2-b202-f92eb0de3091","Type":"ContainerStarted","Data":"65470cdbb87951b04743814395a739e46c6ad71136adcca567f2ca035371d07a"}
Oct 09 10:50:47 crc kubenswrapper[4740]: I1009 10:50:47.146955 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6tbdk" podStartSLOduration=2.603229471 podStartE2EDuration="5.14692574s" podCreationTimestamp="2025-10-09 10:50:42 +0000 UTC" firstStartedPulling="2025-10-09 10:50:44.08273515 +0000 UTC m=+1383.044935531" lastFinishedPulling="2025-10-09 10:50:46.626431419 +0000 UTC m=+1385.588631800" observedRunningTime="2025-10-09 10:50:47.135724358 +0000 UTC m=+1386.097924749" watchObservedRunningTime="2025-10-09 10:50:47.14692574 +0000 UTC m=+1386.109126161"
Oct 09 10:50:53 crc kubenswrapper[4740]: I1009 10:50:53.049299 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6tbdk"
Oct 09 10:50:53 crc kubenswrapper[4740]: I1009 10:50:53.049892 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6tbdk"
Oct 09 10:50:53 crc kubenswrapper[4740]: I1009 10:50:53.096677 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6tbdk"
Oct 09 10:50:53 crc kubenswrapper[4740]: I1009 10:50:53.229627 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6tbdk"
Oct 09 10:50:53 crc kubenswrapper[4740]: I1009 10:50:53.331806 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6tbdk"]
Oct 09 10:50:55 crc kubenswrapper[4740]: I1009 10:50:55.229343 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6tbdk" podUID="0730fcb7-39c4-40f2-b202-f92eb0de3091" containerName="registry-server" containerID="cri-o://65470cdbb87951b04743814395a739e46c6ad71136adcca567f2ca035371d07a" gracePeriod=2
Oct 09 10:50:55 crc kubenswrapper[4740]: I1009 10:50:55.684634 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6tbdk"
Oct 09 10:50:55 crc kubenswrapper[4740]: I1009 10:50:55.794482 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0730fcb7-39c4-40f2-b202-f92eb0de3091-utilities\") pod \"0730fcb7-39c4-40f2-b202-f92eb0de3091\" (UID: \"0730fcb7-39c4-40f2-b202-f92eb0de3091\") "
Oct 09 10:50:55 crc kubenswrapper[4740]: I1009 10:50:55.794644 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpp2s\" (UniqueName: \"kubernetes.io/projected/0730fcb7-39c4-40f2-b202-f92eb0de3091-kube-api-access-bpp2s\") pod \"0730fcb7-39c4-40f2-b202-f92eb0de3091\" (UID: \"0730fcb7-39c4-40f2-b202-f92eb0de3091\") "
Oct 09 10:50:55 crc kubenswrapper[4740]: I1009 10:50:55.794743 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0730fcb7-39c4-40f2-b202-f92eb0de3091-catalog-content\") pod \"0730fcb7-39c4-40f2-b202-f92eb0de3091\" (UID: \"0730fcb7-39c4-40f2-b202-f92eb0de3091\") "
Oct 09 10:50:55 crc kubenswrapper[4740]: I1009 10:50:55.795671 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0730fcb7-39c4-40f2-b202-f92eb0de3091-utilities" (OuterVolumeSpecName: "utilities") pod "0730fcb7-39c4-40f2-b202-f92eb0de3091" (UID: "0730fcb7-39c4-40f2-b202-f92eb0de3091"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 10:50:55 crc kubenswrapper[4740]: I1009 10:50:55.801173 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0730fcb7-39c4-40f2-b202-f92eb0de3091-kube-api-access-bpp2s" (OuterVolumeSpecName: "kube-api-access-bpp2s") pod "0730fcb7-39c4-40f2-b202-f92eb0de3091" (UID: "0730fcb7-39c4-40f2-b202-f92eb0de3091"). InnerVolumeSpecName "kube-api-access-bpp2s".
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:50:55 crc kubenswrapper[4740]: I1009 10:50:55.888813 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0730fcb7-39c4-40f2-b202-f92eb0de3091-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0730fcb7-39c4-40f2-b202-f92eb0de3091" (UID: "0730fcb7-39c4-40f2-b202-f92eb0de3091"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:50:55 crc kubenswrapper[4740]: I1009 10:50:55.897439 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpp2s\" (UniqueName: \"kubernetes.io/projected/0730fcb7-39c4-40f2-b202-f92eb0de3091-kube-api-access-bpp2s\") on node \"crc\" DevicePath \"\"" Oct 09 10:50:55 crc kubenswrapper[4740]: I1009 10:50:55.897470 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0730fcb7-39c4-40f2-b202-f92eb0de3091-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 10:50:55 crc kubenswrapper[4740]: I1009 10:50:55.897482 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0730fcb7-39c4-40f2-b202-f92eb0de3091-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 10:50:56 crc kubenswrapper[4740]: I1009 10:50:56.243780 4740 generic.go:334] "Generic (PLEG): container finished" podID="0730fcb7-39c4-40f2-b202-f92eb0de3091" containerID="65470cdbb87951b04743814395a739e46c6ad71136adcca567f2ca035371d07a" exitCode=0 Oct 09 10:50:56 crc kubenswrapper[4740]: I1009 10:50:56.243827 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6tbdk" event={"ID":"0730fcb7-39c4-40f2-b202-f92eb0de3091","Type":"ContainerDied","Data":"65470cdbb87951b04743814395a739e46c6ad71136adcca567f2ca035371d07a"} Oct 09 10:50:56 crc kubenswrapper[4740]: I1009 10:50:56.243853 4740 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-6tbdk" event={"ID":"0730fcb7-39c4-40f2-b202-f92eb0de3091","Type":"ContainerDied","Data":"f2e31a0d32557ba3eee4120c80c06d90473bd09183a547a7a7a628ecbf6e15f7"} Oct 09 10:50:56 crc kubenswrapper[4740]: I1009 10:50:56.243870 4740 scope.go:117] "RemoveContainer" containerID="65470cdbb87951b04743814395a739e46c6ad71136adcca567f2ca035371d07a" Oct 09 10:50:56 crc kubenswrapper[4740]: I1009 10:50:56.243876 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6tbdk" Oct 09 10:50:56 crc kubenswrapper[4740]: I1009 10:50:56.274450 4740 scope.go:117] "RemoveContainer" containerID="6d289f7d4a0228911b3c6ea5cd5fc292b2996c47235b6860ad1ef128a574a4f2" Oct 09 10:50:56 crc kubenswrapper[4740]: I1009 10:50:56.288997 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6tbdk"] Oct 09 10:50:56 crc kubenswrapper[4740]: I1009 10:50:56.297244 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6tbdk"] Oct 09 10:50:56 crc kubenswrapper[4740]: I1009 10:50:56.324962 4740 scope.go:117] "RemoveContainer" containerID="d6636f2543b7ff4ff8c98793985b068f1f706a9e5edfc44776673b1e57fd1ce2" Oct 09 10:50:56 crc kubenswrapper[4740]: I1009 10:50:56.369108 4740 scope.go:117] "RemoveContainer" containerID="65470cdbb87951b04743814395a739e46c6ad71136adcca567f2ca035371d07a" Oct 09 10:50:56 crc kubenswrapper[4740]: E1009 10:50:56.369514 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65470cdbb87951b04743814395a739e46c6ad71136adcca567f2ca035371d07a\": container with ID starting with 65470cdbb87951b04743814395a739e46c6ad71136adcca567f2ca035371d07a not found: ID does not exist" containerID="65470cdbb87951b04743814395a739e46c6ad71136adcca567f2ca035371d07a" Oct 09 10:50:56 crc kubenswrapper[4740]: I1009 10:50:56.369556 4740 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65470cdbb87951b04743814395a739e46c6ad71136adcca567f2ca035371d07a"} err="failed to get container status \"65470cdbb87951b04743814395a739e46c6ad71136adcca567f2ca035371d07a\": rpc error: code = NotFound desc = could not find container \"65470cdbb87951b04743814395a739e46c6ad71136adcca567f2ca035371d07a\": container with ID starting with 65470cdbb87951b04743814395a739e46c6ad71136adcca567f2ca035371d07a not found: ID does not exist" Oct 09 10:50:56 crc kubenswrapper[4740]: I1009 10:50:56.369593 4740 scope.go:117] "RemoveContainer" containerID="6d289f7d4a0228911b3c6ea5cd5fc292b2996c47235b6860ad1ef128a574a4f2" Oct 09 10:50:56 crc kubenswrapper[4740]: E1009 10:50:56.369952 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d289f7d4a0228911b3c6ea5cd5fc292b2996c47235b6860ad1ef128a574a4f2\": container with ID starting with 6d289f7d4a0228911b3c6ea5cd5fc292b2996c47235b6860ad1ef128a574a4f2 not found: ID does not exist" containerID="6d289f7d4a0228911b3c6ea5cd5fc292b2996c47235b6860ad1ef128a574a4f2" Oct 09 10:50:56 crc kubenswrapper[4740]: I1009 10:50:56.370031 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d289f7d4a0228911b3c6ea5cd5fc292b2996c47235b6860ad1ef128a574a4f2"} err="failed to get container status \"6d289f7d4a0228911b3c6ea5cd5fc292b2996c47235b6860ad1ef128a574a4f2\": rpc error: code = NotFound desc = could not find container \"6d289f7d4a0228911b3c6ea5cd5fc292b2996c47235b6860ad1ef128a574a4f2\": container with ID starting with 6d289f7d4a0228911b3c6ea5cd5fc292b2996c47235b6860ad1ef128a574a4f2 not found: ID does not exist" Oct 09 10:50:56 crc kubenswrapper[4740]: I1009 10:50:56.370056 4740 scope.go:117] "RemoveContainer" containerID="d6636f2543b7ff4ff8c98793985b068f1f706a9e5edfc44776673b1e57fd1ce2" Oct 09 10:50:56 crc kubenswrapper[4740]: E1009 
10:50:56.370452 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6636f2543b7ff4ff8c98793985b068f1f706a9e5edfc44776673b1e57fd1ce2\": container with ID starting with d6636f2543b7ff4ff8c98793985b068f1f706a9e5edfc44776673b1e57fd1ce2 not found: ID does not exist" containerID="d6636f2543b7ff4ff8c98793985b068f1f706a9e5edfc44776673b1e57fd1ce2" Oct 09 10:50:56 crc kubenswrapper[4740]: I1009 10:50:56.370512 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6636f2543b7ff4ff8c98793985b068f1f706a9e5edfc44776673b1e57fd1ce2"} err="failed to get container status \"d6636f2543b7ff4ff8c98793985b068f1f706a9e5edfc44776673b1e57fd1ce2\": rpc error: code = NotFound desc = could not find container \"d6636f2543b7ff4ff8c98793985b068f1f706a9e5edfc44776673b1e57fd1ce2\": container with ID starting with d6636f2543b7ff4ff8c98793985b068f1f706a9e5edfc44776673b1e57fd1ce2 not found: ID does not exist" Oct 09 10:50:57 crc kubenswrapper[4740]: I1009 10:50:57.765063 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0730fcb7-39c4-40f2-b202-f92eb0de3091" path="/var/lib/kubelet/pods/0730fcb7-39c4-40f2-b202-f92eb0de3091/volumes" Oct 09 10:51:09 crc kubenswrapper[4740]: I1009 10:51:09.825104 4740 scope.go:117] "RemoveContainer" containerID="bf39e63bc4d955c62d5b2326fc62b360c447c4c352a8032b2f5e14e8ff7a2b5a" Oct 09 10:51:09 crc kubenswrapper[4740]: I1009 10:51:09.854987 4740 scope.go:117] "RemoveContainer" containerID="ee2fd56da6d03fafffab9c3ee1e97e3ee4fd8e9a23ca462b1ed0428116c0d9cb" Oct 09 10:51:09 crc kubenswrapper[4740]: I1009 10:51:09.906966 4740 scope.go:117] "RemoveContainer" containerID="7178066fd7948f6c4b1b3fd996cad0ff8346b3903f2052712eeecd883e501538" Oct 09 10:51:27 crc kubenswrapper[4740]: I1009 10:51:27.972728 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-56dpf"] Oct 09 10:51:27 crc 
kubenswrapper[4740]: E1009 10:51:27.975024 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0730fcb7-39c4-40f2-b202-f92eb0de3091" containerName="extract-content" Oct 09 10:51:27 crc kubenswrapper[4740]: I1009 10:51:27.975163 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0730fcb7-39c4-40f2-b202-f92eb0de3091" containerName="extract-content" Oct 09 10:51:27 crc kubenswrapper[4740]: E1009 10:51:27.975280 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0730fcb7-39c4-40f2-b202-f92eb0de3091" containerName="extract-utilities" Oct 09 10:51:27 crc kubenswrapper[4740]: I1009 10:51:27.975360 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0730fcb7-39c4-40f2-b202-f92eb0de3091" containerName="extract-utilities" Oct 09 10:51:27 crc kubenswrapper[4740]: E1009 10:51:27.975439 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0730fcb7-39c4-40f2-b202-f92eb0de3091" containerName="registry-server" Oct 09 10:51:27 crc kubenswrapper[4740]: I1009 10:51:27.975511 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0730fcb7-39c4-40f2-b202-f92eb0de3091" containerName="registry-server" Oct 09 10:51:27 crc kubenswrapper[4740]: I1009 10:51:27.975876 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0730fcb7-39c4-40f2-b202-f92eb0de3091" containerName="registry-server" Oct 09 10:51:27 crc kubenswrapper[4740]: I1009 10:51:27.977794 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-56dpf" Oct 09 10:51:27 crc kubenswrapper[4740]: I1009 10:51:27.992045 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-56dpf"] Oct 09 10:51:28 crc kubenswrapper[4740]: I1009 10:51:28.121332 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ba252c9-2cc5-4914-b41a-08e992039df8-utilities\") pod \"redhat-marketplace-56dpf\" (UID: \"0ba252c9-2cc5-4914-b41a-08e992039df8\") " pod="openshift-marketplace/redhat-marketplace-56dpf" Oct 09 10:51:28 crc kubenswrapper[4740]: I1009 10:51:28.121656 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wmzr\" (UniqueName: \"kubernetes.io/projected/0ba252c9-2cc5-4914-b41a-08e992039df8-kube-api-access-9wmzr\") pod \"redhat-marketplace-56dpf\" (UID: \"0ba252c9-2cc5-4914-b41a-08e992039df8\") " pod="openshift-marketplace/redhat-marketplace-56dpf" Oct 09 10:51:28 crc kubenswrapper[4740]: I1009 10:51:28.121792 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ba252c9-2cc5-4914-b41a-08e992039df8-catalog-content\") pod \"redhat-marketplace-56dpf\" (UID: \"0ba252c9-2cc5-4914-b41a-08e992039df8\") " pod="openshift-marketplace/redhat-marketplace-56dpf" Oct 09 10:51:28 crc kubenswrapper[4740]: I1009 10:51:28.223303 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ba252c9-2cc5-4914-b41a-08e992039df8-utilities\") pod \"redhat-marketplace-56dpf\" (UID: \"0ba252c9-2cc5-4914-b41a-08e992039df8\") " pod="openshift-marketplace/redhat-marketplace-56dpf" Oct 09 10:51:28 crc kubenswrapper[4740]: I1009 10:51:28.223374 4740 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-9wmzr\" (UniqueName: \"kubernetes.io/projected/0ba252c9-2cc5-4914-b41a-08e992039df8-kube-api-access-9wmzr\") pod \"redhat-marketplace-56dpf\" (UID: \"0ba252c9-2cc5-4914-b41a-08e992039df8\") " pod="openshift-marketplace/redhat-marketplace-56dpf" Oct 09 10:51:28 crc kubenswrapper[4740]: I1009 10:51:28.223493 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ba252c9-2cc5-4914-b41a-08e992039df8-catalog-content\") pod \"redhat-marketplace-56dpf\" (UID: \"0ba252c9-2cc5-4914-b41a-08e992039df8\") " pod="openshift-marketplace/redhat-marketplace-56dpf" Oct 09 10:51:28 crc kubenswrapper[4740]: I1009 10:51:28.224227 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ba252c9-2cc5-4914-b41a-08e992039df8-utilities\") pod \"redhat-marketplace-56dpf\" (UID: \"0ba252c9-2cc5-4914-b41a-08e992039df8\") " pod="openshift-marketplace/redhat-marketplace-56dpf" Oct 09 10:51:28 crc kubenswrapper[4740]: I1009 10:51:28.224301 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ba252c9-2cc5-4914-b41a-08e992039df8-catalog-content\") pod \"redhat-marketplace-56dpf\" (UID: \"0ba252c9-2cc5-4914-b41a-08e992039df8\") " pod="openshift-marketplace/redhat-marketplace-56dpf" Oct 09 10:51:28 crc kubenswrapper[4740]: I1009 10:51:28.245251 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wmzr\" (UniqueName: \"kubernetes.io/projected/0ba252c9-2cc5-4914-b41a-08e992039df8-kube-api-access-9wmzr\") pod \"redhat-marketplace-56dpf\" (UID: \"0ba252c9-2cc5-4914-b41a-08e992039df8\") " pod="openshift-marketplace/redhat-marketplace-56dpf" Oct 09 10:51:28 crc kubenswrapper[4740]: I1009 10:51:28.305216 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-56dpf" Oct 09 10:51:28 crc kubenswrapper[4740]: I1009 10:51:28.775268 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-56dpf"] Oct 09 10:51:29 crc kubenswrapper[4740]: I1009 10:51:29.583182 4740 generic.go:334] "Generic (PLEG): container finished" podID="0ba252c9-2cc5-4914-b41a-08e992039df8" containerID="e966daa1755ee684f94d2bf3d0d78235104b49e555c6b31a553d522d6e0dc515" exitCode=0 Oct 09 10:51:29 crc kubenswrapper[4740]: I1009 10:51:29.583274 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-56dpf" event={"ID":"0ba252c9-2cc5-4914-b41a-08e992039df8","Type":"ContainerDied","Data":"e966daa1755ee684f94d2bf3d0d78235104b49e555c6b31a553d522d6e0dc515"} Oct 09 10:51:29 crc kubenswrapper[4740]: I1009 10:51:29.583501 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-56dpf" event={"ID":"0ba252c9-2cc5-4914-b41a-08e992039df8","Type":"ContainerStarted","Data":"f98772da358dfc364cf618e27b62182ef02a38d06528070b303fda81f47b7bce"} Oct 09 10:51:30 crc kubenswrapper[4740]: I1009 10:51:30.594959 4740 generic.go:334] "Generic (PLEG): container finished" podID="0ba252c9-2cc5-4914-b41a-08e992039df8" containerID="8c339778fe1ceb5f6c28c3edc01f78626a70d57031be5c6c3f63f8949ca8c991" exitCode=0 Oct 09 10:51:30 crc kubenswrapper[4740]: I1009 10:51:30.595075 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-56dpf" event={"ID":"0ba252c9-2cc5-4914-b41a-08e992039df8","Type":"ContainerDied","Data":"8c339778fe1ceb5f6c28c3edc01f78626a70d57031be5c6c3f63f8949ca8c991"} Oct 09 10:51:31 crc kubenswrapper[4740]: I1009 10:51:31.606189 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-56dpf" 
event={"ID":"0ba252c9-2cc5-4914-b41a-08e992039df8","Type":"ContainerStarted","Data":"cf688840bb8edb32c6a5c28c3c5ecd9ffbe5fce988acb2c0f88749028bbdcf7c"} Oct 09 10:51:31 crc kubenswrapper[4740]: I1009 10:51:31.637079 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-56dpf" podStartSLOduration=2.945210343 podStartE2EDuration="4.637061251s" podCreationTimestamp="2025-10-09 10:51:27 +0000 UTC" firstStartedPulling="2025-10-09 10:51:29.585200494 +0000 UTC m=+1428.547400875" lastFinishedPulling="2025-10-09 10:51:31.277051402 +0000 UTC m=+1430.239251783" observedRunningTime="2025-10-09 10:51:31.627006655 +0000 UTC m=+1430.589207066" watchObservedRunningTime="2025-10-09 10:51:31.637061251 +0000 UTC m=+1430.599261622" Oct 09 10:51:38 crc kubenswrapper[4740]: I1009 10:51:38.305737 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-56dpf" Oct 09 10:51:38 crc kubenswrapper[4740]: I1009 10:51:38.306377 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-56dpf" Oct 09 10:51:38 crc kubenswrapper[4740]: I1009 10:51:38.368001 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-56dpf" Oct 09 10:51:38 crc kubenswrapper[4740]: I1009 10:51:38.714611 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-56dpf" Oct 09 10:51:38 crc kubenswrapper[4740]: I1009 10:51:38.768570 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-56dpf"] Oct 09 10:51:40 crc kubenswrapper[4740]: I1009 10:51:40.694039 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-56dpf" podUID="0ba252c9-2cc5-4914-b41a-08e992039df8" containerName="registry-server" 
containerID="cri-o://cf688840bb8edb32c6a5c28c3c5ecd9ffbe5fce988acb2c0f88749028bbdcf7c" gracePeriod=2 Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.126028 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-56dpf" Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.284344 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ba252c9-2cc5-4914-b41a-08e992039df8-utilities\") pod \"0ba252c9-2cc5-4914-b41a-08e992039df8\" (UID: \"0ba252c9-2cc5-4914-b41a-08e992039df8\") " Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.284732 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wmzr\" (UniqueName: \"kubernetes.io/projected/0ba252c9-2cc5-4914-b41a-08e992039df8-kube-api-access-9wmzr\") pod \"0ba252c9-2cc5-4914-b41a-08e992039df8\" (UID: \"0ba252c9-2cc5-4914-b41a-08e992039df8\") " Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.284778 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ba252c9-2cc5-4914-b41a-08e992039df8-catalog-content\") pod \"0ba252c9-2cc5-4914-b41a-08e992039df8\" (UID: \"0ba252c9-2cc5-4914-b41a-08e992039df8\") " Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.285320 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ba252c9-2cc5-4914-b41a-08e992039df8-utilities" (OuterVolumeSpecName: "utilities") pod "0ba252c9-2cc5-4914-b41a-08e992039df8" (UID: "0ba252c9-2cc5-4914-b41a-08e992039df8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.290437 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ba252c9-2cc5-4914-b41a-08e992039df8-kube-api-access-9wmzr" (OuterVolumeSpecName: "kube-api-access-9wmzr") pod "0ba252c9-2cc5-4914-b41a-08e992039df8" (UID: "0ba252c9-2cc5-4914-b41a-08e992039df8"). InnerVolumeSpecName "kube-api-access-9wmzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.299523 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ba252c9-2cc5-4914-b41a-08e992039df8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ba252c9-2cc5-4914-b41a-08e992039df8" (UID: "0ba252c9-2cc5-4914-b41a-08e992039df8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.387208 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wmzr\" (UniqueName: \"kubernetes.io/projected/0ba252c9-2cc5-4914-b41a-08e992039df8-kube-api-access-9wmzr\") on node \"crc\" DevicePath \"\"" Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.387239 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ba252c9-2cc5-4914-b41a-08e992039df8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.387251 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ba252c9-2cc5-4914-b41a-08e992039df8-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.705210 4740 generic.go:334] "Generic (PLEG): container finished" podID="0ba252c9-2cc5-4914-b41a-08e992039df8" 
containerID="cf688840bb8edb32c6a5c28c3c5ecd9ffbe5fce988acb2c0f88749028bbdcf7c" exitCode=0 Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.705284 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-56dpf" Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.705283 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-56dpf" event={"ID":"0ba252c9-2cc5-4914-b41a-08e992039df8","Type":"ContainerDied","Data":"cf688840bb8edb32c6a5c28c3c5ecd9ffbe5fce988acb2c0f88749028bbdcf7c"} Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.705358 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-56dpf" event={"ID":"0ba252c9-2cc5-4914-b41a-08e992039df8","Type":"ContainerDied","Data":"f98772da358dfc364cf618e27b62182ef02a38d06528070b303fda81f47b7bce"} Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.705390 4740 scope.go:117] "RemoveContainer" containerID="cf688840bb8edb32c6a5c28c3c5ecd9ffbe5fce988acb2c0f88749028bbdcf7c" Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.724285 4740 scope.go:117] "RemoveContainer" containerID="8c339778fe1ceb5f6c28c3edc01f78626a70d57031be5c6c3f63f8949ca8c991" Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.738465 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-56dpf"] Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.746994 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-56dpf"] Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.759459 4740 scope.go:117] "RemoveContainer" containerID="e966daa1755ee684f94d2bf3d0d78235104b49e555c6b31a553d522d6e0dc515" Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.765050 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ba252c9-2cc5-4914-b41a-08e992039df8" 
path="/var/lib/kubelet/pods/0ba252c9-2cc5-4914-b41a-08e992039df8/volumes" Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.811667 4740 scope.go:117] "RemoveContainer" containerID="cf688840bb8edb32c6a5c28c3c5ecd9ffbe5fce988acb2c0f88749028bbdcf7c" Oct 09 10:51:41 crc kubenswrapper[4740]: E1009 10:51:41.812097 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf688840bb8edb32c6a5c28c3c5ecd9ffbe5fce988acb2c0f88749028bbdcf7c\": container with ID starting with cf688840bb8edb32c6a5c28c3c5ecd9ffbe5fce988acb2c0f88749028bbdcf7c not found: ID does not exist" containerID="cf688840bb8edb32c6a5c28c3c5ecd9ffbe5fce988acb2c0f88749028bbdcf7c" Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.812127 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf688840bb8edb32c6a5c28c3c5ecd9ffbe5fce988acb2c0f88749028bbdcf7c"} err="failed to get container status \"cf688840bb8edb32c6a5c28c3c5ecd9ffbe5fce988acb2c0f88749028bbdcf7c\": rpc error: code = NotFound desc = could not find container \"cf688840bb8edb32c6a5c28c3c5ecd9ffbe5fce988acb2c0f88749028bbdcf7c\": container with ID starting with cf688840bb8edb32c6a5c28c3c5ecd9ffbe5fce988acb2c0f88749028bbdcf7c not found: ID does not exist" Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.812148 4740 scope.go:117] "RemoveContainer" containerID="8c339778fe1ceb5f6c28c3edc01f78626a70d57031be5c6c3f63f8949ca8c991" Oct 09 10:51:41 crc kubenswrapper[4740]: E1009 10:51:41.812415 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c339778fe1ceb5f6c28c3edc01f78626a70d57031be5c6c3f63f8949ca8c991\": container with ID starting with 8c339778fe1ceb5f6c28c3edc01f78626a70d57031be5c6c3f63f8949ca8c991 not found: ID does not exist" containerID="8c339778fe1ceb5f6c28c3edc01f78626a70d57031be5c6c3f63f8949ca8c991" Oct 09 10:51:41 crc kubenswrapper[4740]: 
I1009 10:51:41.812447 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c339778fe1ceb5f6c28c3edc01f78626a70d57031be5c6c3f63f8949ca8c991"} err="failed to get container status \"8c339778fe1ceb5f6c28c3edc01f78626a70d57031be5c6c3f63f8949ca8c991\": rpc error: code = NotFound desc = could not find container \"8c339778fe1ceb5f6c28c3edc01f78626a70d57031be5c6c3f63f8949ca8c991\": container with ID starting with 8c339778fe1ceb5f6c28c3edc01f78626a70d57031be5c6c3f63f8949ca8c991 not found: ID does not exist" Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.812468 4740 scope.go:117] "RemoveContainer" containerID="e966daa1755ee684f94d2bf3d0d78235104b49e555c6b31a553d522d6e0dc515" Oct 09 10:51:41 crc kubenswrapper[4740]: E1009 10:51:41.812929 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e966daa1755ee684f94d2bf3d0d78235104b49e555c6b31a553d522d6e0dc515\": container with ID starting with e966daa1755ee684f94d2bf3d0d78235104b49e555c6b31a553d522d6e0dc515 not found: ID does not exist" containerID="e966daa1755ee684f94d2bf3d0d78235104b49e555c6b31a553d522d6e0dc515" Oct 09 10:51:41 crc kubenswrapper[4740]: I1009 10:51:41.813047 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e966daa1755ee684f94d2bf3d0d78235104b49e555c6b31a553d522d6e0dc515"} err="failed to get container status \"e966daa1755ee684f94d2bf3d0d78235104b49e555c6b31a553d522d6e0dc515\": rpc error: code = NotFound desc = could not find container \"e966daa1755ee684f94d2bf3d0d78235104b49e555c6b31a553d522d6e0dc515\": container with ID starting with e966daa1755ee684f94d2bf3d0d78235104b49e555c6b31a553d522d6e0dc515 not found: ID does not exist" Oct 09 10:51:53 crc kubenswrapper[4740]: I1009 10:51:53.680289 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cx9vs"] Oct 09 10:51:53 crc kubenswrapper[4740]: 
E1009 10:51:53.682403 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba252c9-2cc5-4914-b41a-08e992039df8" containerName="extract-utilities"
Oct 09 10:51:53 crc kubenswrapper[4740]: I1009 10:51:53.682508 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba252c9-2cc5-4914-b41a-08e992039df8" containerName="extract-utilities"
Oct 09 10:51:53 crc kubenswrapper[4740]: E1009 10:51:53.682598 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba252c9-2cc5-4914-b41a-08e992039df8" containerName="extract-content"
Oct 09 10:51:53 crc kubenswrapper[4740]: I1009 10:51:53.682736 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba252c9-2cc5-4914-b41a-08e992039df8" containerName="extract-content"
Oct 09 10:51:53 crc kubenswrapper[4740]: E1009 10:51:53.682916 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba252c9-2cc5-4914-b41a-08e992039df8" containerName="registry-server"
Oct 09 10:51:53 crc kubenswrapper[4740]: I1009 10:51:53.683028 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba252c9-2cc5-4914-b41a-08e992039df8" containerName="registry-server"
Oct 09 10:51:53 crc kubenswrapper[4740]: I1009 10:51:53.683446 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba252c9-2cc5-4914-b41a-08e992039df8" containerName="registry-server"
Oct 09 10:51:53 crc kubenswrapper[4740]: I1009 10:51:53.685923 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cx9vs"
Oct 09 10:51:53 crc kubenswrapper[4740]: I1009 10:51:53.694988 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cx9vs"]
Oct 09 10:51:53 crc kubenswrapper[4740]: I1009 10:51:53.836219 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/406056d4-d37f-4305-a315-9a7e64218b09-catalog-content\") pod \"community-operators-cx9vs\" (UID: \"406056d4-d37f-4305-a315-9a7e64218b09\") " pod="openshift-marketplace/community-operators-cx9vs"
Oct 09 10:51:53 crc kubenswrapper[4740]: I1009 10:51:53.836336 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/406056d4-d37f-4305-a315-9a7e64218b09-utilities\") pod \"community-operators-cx9vs\" (UID: \"406056d4-d37f-4305-a315-9a7e64218b09\") " pod="openshift-marketplace/community-operators-cx9vs"
Oct 09 10:51:53 crc kubenswrapper[4740]: I1009 10:51:53.836388 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mrcn\" (UniqueName: \"kubernetes.io/projected/406056d4-d37f-4305-a315-9a7e64218b09-kube-api-access-9mrcn\") pod \"community-operators-cx9vs\" (UID: \"406056d4-d37f-4305-a315-9a7e64218b09\") " pod="openshift-marketplace/community-operators-cx9vs"
Oct 09 10:51:53 crc kubenswrapper[4740]: I1009 10:51:53.938076 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mrcn\" (UniqueName: \"kubernetes.io/projected/406056d4-d37f-4305-a315-9a7e64218b09-kube-api-access-9mrcn\") pod \"community-operators-cx9vs\" (UID: \"406056d4-d37f-4305-a315-9a7e64218b09\") " pod="openshift-marketplace/community-operators-cx9vs"
Oct 09 10:51:53 crc kubenswrapper[4740]: I1009 10:51:53.938262 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/406056d4-d37f-4305-a315-9a7e64218b09-catalog-content\") pod \"community-operators-cx9vs\" (UID: \"406056d4-d37f-4305-a315-9a7e64218b09\") " pod="openshift-marketplace/community-operators-cx9vs"
Oct 09 10:51:53 crc kubenswrapper[4740]: I1009 10:51:53.938326 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/406056d4-d37f-4305-a315-9a7e64218b09-utilities\") pod \"community-operators-cx9vs\" (UID: \"406056d4-d37f-4305-a315-9a7e64218b09\") " pod="openshift-marketplace/community-operators-cx9vs"
Oct 09 10:51:53 crc kubenswrapper[4740]: I1009 10:51:53.938859 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/406056d4-d37f-4305-a315-9a7e64218b09-utilities\") pod \"community-operators-cx9vs\" (UID: \"406056d4-d37f-4305-a315-9a7e64218b09\") " pod="openshift-marketplace/community-operators-cx9vs"
Oct 09 10:51:53 crc kubenswrapper[4740]: I1009 10:51:53.938883 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/406056d4-d37f-4305-a315-9a7e64218b09-catalog-content\") pod \"community-operators-cx9vs\" (UID: \"406056d4-d37f-4305-a315-9a7e64218b09\") " pod="openshift-marketplace/community-operators-cx9vs"
Oct 09 10:51:53 crc kubenswrapper[4740]: I1009 10:51:53.962469 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mrcn\" (UniqueName: \"kubernetes.io/projected/406056d4-d37f-4305-a315-9a7e64218b09-kube-api-access-9mrcn\") pod \"community-operators-cx9vs\" (UID: \"406056d4-d37f-4305-a315-9a7e64218b09\") " pod="openshift-marketplace/community-operators-cx9vs"
Oct 09 10:51:54 crc kubenswrapper[4740]: I1009 10:51:54.003853 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cx9vs"
Oct 09 10:51:54 crc kubenswrapper[4740]: I1009 10:51:54.509763 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cx9vs"]
Oct 09 10:51:54 crc kubenswrapper[4740]: W1009 10:51:54.515967 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod406056d4_d37f_4305_a315_9a7e64218b09.slice/crio-c3ee9f24a58fe936e46044e78d10e8b2abcc2461015f193ccf0e22113ae38b06 WatchSource:0}: Error finding container c3ee9f24a58fe936e46044e78d10e8b2abcc2461015f193ccf0e22113ae38b06: Status 404 returned error can't find the container with id c3ee9f24a58fe936e46044e78d10e8b2abcc2461015f193ccf0e22113ae38b06
Oct 09 10:51:54 crc kubenswrapper[4740]: I1009 10:51:54.840588 4740 generic.go:334] "Generic (PLEG): container finished" podID="406056d4-d37f-4305-a315-9a7e64218b09" containerID="9b855d20c54b44b526f785cd56d368658545b6725fac9d4f243e7e9aefe1b4f8" exitCode=0
Oct 09 10:51:54 crc kubenswrapper[4740]: I1009 10:51:54.840660 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx9vs" event={"ID":"406056d4-d37f-4305-a315-9a7e64218b09","Type":"ContainerDied","Data":"9b855d20c54b44b526f785cd56d368658545b6725fac9d4f243e7e9aefe1b4f8"}
Oct 09 10:51:54 crc kubenswrapper[4740]: I1009 10:51:54.840991 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx9vs" event={"ID":"406056d4-d37f-4305-a315-9a7e64218b09","Type":"ContainerStarted","Data":"c3ee9f24a58fe936e46044e78d10e8b2abcc2461015f193ccf0e22113ae38b06"}
Oct 09 10:51:55 crc kubenswrapper[4740]: I1009 10:51:55.880002 4740 generic.go:334] "Generic (PLEG): container finished" podID="406056d4-d37f-4305-a315-9a7e64218b09" containerID="83e190e56027e92129de8e6ac02d5faa51e72c11aa0aa501c7b4212e448fab0b" exitCode=0
Oct 09 10:51:55 crc kubenswrapper[4740]: I1009 10:51:55.880341 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx9vs" event={"ID":"406056d4-d37f-4305-a315-9a7e64218b09","Type":"ContainerDied","Data":"83e190e56027e92129de8e6ac02d5faa51e72c11aa0aa501c7b4212e448fab0b"}
Oct 09 10:51:56 crc kubenswrapper[4740]: I1009 10:51:56.895130 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx9vs" event={"ID":"406056d4-d37f-4305-a315-9a7e64218b09","Type":"ContainerStarted","Data":"f659cc639ded6ad3009d27e5f15ddd767de535dfb67af1b54b9292c1c93d810f"}
Oct 09 10:51:56 crc kubenswrapper[4740]: I1009 10:51:56.918976 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cx9vs" podStartSLOduration=2.28319964 podStartE2EDuration="3.918956482s" podCreationTimestamp="2025-10-09 10:51:53 +0000 UTC" firstStartedPulling="2025-10-09 10:51:54.84426801 +0000 UTC m=+1453.806468401" lastFinishedPulling="2025-10-09 10:51:56.480024852 +0000 UTC m=+1455.442225243" observedRunningTime="2025-10-09 10:51:56.911227037 +0000 UTC m=+1455.873427428" watchObservedRunningTime="2025-10-09 10:51:56.918956482 +0000 UTC m=+1455.881156863"
Oct 09 10:52:04 crc kubenswrapper[4740]: I1009 10:52:04.003935 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cx9vs"
Oct 09 10:52:04 crc kubenswrapper[4740]: I1009 10:52:04.004696 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cx9vs"
Oct 09 10:52:04 crc kubenswrapper[4740]: I1009 10:52:04.091874 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cx9vs"
Oct 09 10:52:05 crc kubenswrapper[4740]: I1009 10:52:05.048230 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cx9vs"
Oct 09 10:52:05 crc kubenswrapper[4740]: I1009 10:52:05.100857 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cx9vs"]
Oct 09 10:52:05 crc kubenswrapper[4740]: I1009 10:52:05.408131 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 09 10:52:05 crc kubenswrapper[4740]: I1009 10:52:05.408217 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 09 10:52:07 crc kubenswrapper[4740]: I1009 10:52:07.000343 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cx9vs" podUID="406056d4-d37f-4305-a315-9a7e64218b09" containerName="registry-server" containerID="cri-o://f659cc639ded6ad3009d27e5f15ddd767de535dfb67af1b54b9292c1c93d810f" gracePeriod=2
Oct 09 10:52:07 crc kubenswrapper[4740]: I1009 10:52:07.429285 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cx9vs"
Oct 09 10:52:07 crc kubenswrapper[4740]: I1009 10:52:07.507279 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/406056d4-d37f-4305-a315-9a7e64218b09-utilities\") pod \"406056d4-d37f-4305-a315-9a7e64218b09\" (UID: \"406056d4-d37f-4305-a315-9a7e64218b09\") "
Oct 09 10:52:07 crc kubenswrapper[4740]: I1009 10:52:07.507366 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/406056d4-d37f-4305-a315-9a7e64218b09-catalog-content\") pod \"406056d4-d37f-4305-a315-9a7e64218b09\" (UID: \"406056d4-d37f-4305-a315-9a7e64218b09\") "
Oct 09 10:52:07 crc kubenswrapper[4740]: I1009 10:52:07.507484 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mrcn\" (UniqueName: \"kubernetes.io/projected/406056d4-d37f-4305-a315-9a7e64218b09-kube-api-access-9mrcn\") pod \"406056d4-d37f-4305-a315-9a7e64218b09\" (UID: \"406056d4-d37f-4305-a315-9a7e64218b09\") "
Oct 09 10:52:07 crc kubenswrapper[4740]: I1009 10:52:07.513517 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/406056d4-d37f-4305-a315-9a7e64218b09-kube-api-access-9mrcn" (OuterVolumeSpecName: "kube-api-access-9mrcn") pod "406056d4-d37f-4305-a315-9a7e64218b09" (UID: "406056d4-d37f-4305-a315-9a7e64218b09"). InnerVolumeSpecName "kube-api-access-9mrcn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:52:07 crc kubenswrapper[4740]: I1009 10:52:07.514165 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/406056d4-d37f-4305-a315-9a7e64218b09-utilities" (OuterVolumeSpecName: "utilities") pod "406056d4-d37f-4305-a315-9a7e64218b09" (UID: "406056d4-d37f-4305-a315-9a7e64218b09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 10:52:07 crc kubenswrapper[4740]: I1009 10:52:07.565259 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/406056d4-d37f-4305-a315-9a7e64218b09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "406056d4-d37f-4305-a315-9a7e64218b09" (UID: "406056d4-d37f-4305-a315-9a7e64218b09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 10:52:07 crc kubenswrapper[4740]: I1009 10:52:07.609616 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/406056d4-d37f-4305-a315-9a7e64218b09-utilities\") on node \"crc\" DevicePath \"\""
Oct 09 10:52:07 crc kubenswrapper[4740]: I1009 10:52:07.609671 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/406056d4-d37f-4305-a315-9a7e64218b09-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 09 10:52:07 crc kubenswrapper[4740]: I1009 10:52:07.609682 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mrcn\" (UniqueName: \"kubernetes.io/projected/406056d4-d37f-4305-a315-9a7e64218b09-kube-api-access-9mrcn\") on node \"crc\" DevicePath \"\""
Oct 09 10:52:08 crc kubenswrapper[4740]: I1009 10:52:08.012923 4740 generic.go:334] "Generic (PLEG): container finished" podID="406056d4-d37f-4305-a315-9a7e64218b09" containerID="f659cc639ded6ad3009d27e5f15ddd767de535dfb67af1b54b9292c1c93d810f" exitCode=0
Oct 09 10:52:08 crc kubenswrapper[4740]: I1009 10:52:08.012976 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx9vs" event={"ID":"406056d4-d37f-4305-a315-9a7e64218b09","Type":"ContainerDied","Data":"f659cc639ded6ad3009d27e5f15ddd767de535dfb67af1b54b9292c1c93d810f"}
Oct 09 10:52:08 crc kubenswrapper[4740]: I1009 10:52:08.012984 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cx9vs"
Oct 09 10:52:08 crc kubenswrapper[4740]: I1009 10:52:08.013010 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx9vs" event={"ID":"406056d4-d37f-4305-a315-9a7e64218b09","Type":"ContainerDied","Data":"c3ee9f24a58fe936e46044e78d10e8b2abcc2461015f193ccf0e22113ae38b06"}
Oct 09 10:52:08 crc kubenswrapper[4740]: I1009 10:52:08.013031 4740 scope.go:117] "RemoveContainer" containerID="f659cc639ded6ad3009d27e5f15ddd767de535dfb67af1b54b9292c1c93d810f"
Oct 09 10:52:08 crc kubenswrapper[4740]: I1009 10:52:08.042286 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cx9vs"]
Oct 09 10:52:08 crc kubenswrapper[4740]: I1009 10:52:08.047287 4740 scope.go:117] "RemoveContainer" containerID="83e190e56027e92129de8e6ac02d5faa51e72c11aa0aa501c7b4212e448fab0b"
Oct 09 10:52:08 crc kubenswrapper[4740]: I1009 10:52:08.055711 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cx9vs"]
Oct 09 10:52:08 crc kubenswrapper[4740]: I1009 10:52:08.068520 4740 scope.go:117] "RemoveContainer" containerID="9b855d20c54b44b526f785cd56d368658545b6725fac9d4f243e7e9aefe1b4f8"
Oct 09 10:52:08 crc kubenswrapper[4740]: I1009 10:52:08.124874 4740 scope.go:117] "RemoveContainer" containerID="f659cc639ded6ad3009d27e5f15ddd767de535dfb67af1b54b9292c1c93d810f"
Oct 09 10:52:08 crc kubenswrapper[4740]: E1009 10:52:08.125473 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f659cc639ded6ad3009d27e5f15ddd767de535dfb67af1b54b9292c1c93d810f\": container with ID starting with f659cc639ded6ad3009d27e5f15ddd767de535dfb67af1b54b9292c1c93d810f not found: ID does not exist" containerID="f659cc639ded6ad3009d27e5f15ddd767de535dfb67af1b54b9292c1c93d810f"
Oct 09 10:52:08 crc kubenswrapper[4740]: I1009 10:52:08.125530 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f659cc639ded6ad3009d27e5f15ddd767de535dfb67af1b54b9292c1c93d810f"} err="failed to get container status \"f659cc639ded6ad3009d27e5f15ddd767de535dfb67af1b54b9292c1c93d810f\": rpc error: code = NotFound desc = could not find container \"f659cc639ded6ad3009d27e5f15ddd767de535dfb67af1b54b9292c1c93d810f\": container with ID starting with f659cc639ded6ad3009d27e5f15ddd767de535dfb67af1b54b9292c1c93d810f not found: ID does not exist"
Oct 09 10:52:08 crc kubenswrapper[4740]: I1009 10:52:08.125559 4740 scope.go:117] "RemoveContainer" containerID="83e190e56027e92129de8e6ac02d5faa51e72c11aa0aa501c7b4212e448fab0b"
Oct 09 10:52:08 crc kubenswrapper[4740]: E1009 10:52:08.126115 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83e190e56027e92129de8e6ac02d5faa51e72c11aa0aa501c7b4212e448fab0b\": container with ID starting with 83e190e56027e92129de8e6ac02d5faa51e72c11aa0aa501c7b4212e448fab0b not found: ID does not exist" containerID="83e190e56027e92129de8e6ac02d5faa51e72c11aa0aa501c7b4212e448fab0b"
Oct 09 10:52:08 crc kubenswrapper[4740]: I1009 10:52:08.126242 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83e190e56027e92129de8e6ac02d5faa51e72c11aa0aa501c7b4212e448fab0b"} err="failed to get container status \"83e190e56027e92129de8e6ac02d5faa51e72c11aa0aa501c7b4212e448fab0b\": rpc error: code = NotFound desc = could not find container \"83e190e56027e92129de8e6ac02d5faa51e72c11aa0aa501c7b4212e448fab0b\": container with ID starting with 83e190e56027e92129de8e6ac02d5faa51e72c11aa0aa501c7b4212e448fab0b not found: ID does not exist"
Oct 09 10:52:08 crc kubenswrapper[4740]: I1009 10:52:08.126277 4740 scope.go:117] "RemoveContainer" containerID="9b855d20c54b44b526f785cd56d368658545b6725fac9d4f243e7e9aefe1b4f8"
Oct 09 10:52:08 crc kubenswrapper[4740]: E1009 10:52:08.126718 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b855d20c54b44b526f785cd56d368658545b6725fac9d4f243e7e9aefe1b4f8\": container with ID starting with 9b855d20c54b44b526f785cd56d368658545b6725fac9d4f243e7e9aefe1b4f8 not found: ID does not exist" containerID="9b855d20c54b44b526f785cd56d368658545b6725fac9d4f243e7e9aefe1b4f8"
Oct 09 10:52:08 crc kubenswrapper[4740]: I1009 10:52:08.126914 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b855d20c54b44b526f785cd56d368658545b6725fac9d4f243e7e9aefe1b4f8"} err="failed to get container status \"9b855d20c54b44b526f785cd56d368658545b6725fac9d4f243e7e9aefe1b4f8\": rpc error: code = NotFound desc = could not find container \"9b855d20c54b44b526f785cd56d368658545b6725fac9d4f243e7e9aefe1b4f8\": container with ID starting with 9b855d20c54b44b526f785cd56d368658545b6725fac9d4f243e7e9aefe1b4f8 not found: ID does not exist"
Oct 09 10:52:09 crc kubenswrapper[4740]: I1009 10:52:09.770961 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="406056d4-d37f-4305-a315-9a7e64218b09" path="/var/lib/kubelet/pods/406056d4-d37f-4305-a315-9a7e64218b09/volumes"
Oct 09 10:52:10 crc kubenswrapper[4740]: I1009 10:52:10.057174 4740 scope.go:117] "RemoveContainer" containerID="69c62d42a35a352f169fbf0778bd61ee3bbb732992914b48d25dd7792b562edd"
Oct 09 10:52:10 crc kubenswrapper[4740]: I1009 10:52:10.101591 4740 scope.go:117] "RemoveContainer" containerID="a98c21eed4d0e1bdbab9c4f4a2d8a3a69aaf2528ff6c4a613504a7a0684f43f2"
Oct 09 10:52:31 crc kubenswrapper[4740]: I1009 10:52:31.149266 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mz9pq"]
Oct 09 10:52:31 crc kubenswrapper[4740]: E1009 10:52:31.150183 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="406056d4-d37f-4305-a315-9a7e64218b09" containerName="extract-utilities"
Oct 09 10:52:31 crc kubenswrapper[4740]: I1009 10:52:31.150195 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="406056d4-d37f-4305-a315-9a7e64218b09" containerName="extract-utilities"
Oct 09 10:52:31 crc kubenswrapper[4740]: E1009 10:52:31.150212 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="406056d4-d37f-4305-a315-9a7e64218b09" containerName="registry-server"
Oct 09 10:52:31 crc kubenswrapper[4740]: I1009 10:52:31.150218 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="406056d4-d37f-4305-a315-9a7e64218b09" containerName="registry-server"
Oct 09 10:52:31 crc kubenswrapper[4740]: E1009 10:52:31.150243 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="406056d4-d37f-4305-a315-9a7e64218b09" containerName="extract-content"
Oct 09 10:52:31 crc kubenswrapper[4740]: I1009 10:52:31.150250 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="406056d4-d37f-4305-a315-9a7e64218b09" containerName="extract-content"
Oct 09 10:52:31 crc kubenswrapper[4740]: I1009 10:52:31.150415 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="406056d4-d37f-4305-a315-9a7e64218b09" containerName="registry-server"
Oct 09 10:52:31 crc kubenswrapper[4740]: I1009 10:52:31.152160 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mz9pq"
Oct 09 10:52:31 crc kubenswrapper[4740]: I1009 10:52:31.167737 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mz9pq"]
Oct 09 10:52:31 crc kubenswrapper[4740]: I1009 10:52:31.274720 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m79r2\" (UniqueName: \"kubernetes.io/projected/7e490083-3e13-4528-b489-8a7555abb9be-kube-api-access-m79r2\") pod \"certified-operators-mz9pq\" (UID: \"7e490083-3e13-4528-b489-8a7555abb9be\") " pod="openshift-marketplace/certified-operators-mz9pq"
Oct 09 10:52:31 crc kubenswrapper[4740]: I1009 10:52:31.275003 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e490083-3e13-4528-b489-8a7555abb9be-utilities\") pod \"certified-operators-mz9pq\" (UID: \"7e490083-3e13-4528-b489-8a7555abb9be\") " pod="openshift-marketplace/certified-operators-mz9pq"
Oct 09 10:52:31 crc kubenswrapper[4740]: I1009 10:52:31.275256 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e490083-3e13-4528-b489-8a7555abb9be-catalog-content\") pod \"certified-operators-mz9pq\" (UID: \"7e490083-3e13-4528-b489-8a7555abb9be\") " pod="openshift-marketplace/certified-operators-mz9pq"
Oct 09 10:52:31 crc kubenswrapper[4740]: I1009 10:52:31.376864 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m79r2\" (UniqueName: \"kubernetes.io/projected/7e490083-3e13-4528-b489-8a7555abb9be-kube-api-access-m79r2\") pod \"certified-operators-mz9pq\" (UID: \"7e490083-3e13-4528-b489-8a7555abb9be\") " pod="openshift-marketplace/certified-operators-mz9pq"
Oct 09 10:52:31 crc kubenswrapper[4740]: I1009 10:52:31.377367 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e490083-3e13-4528-b489-8a7555abb9be-utilities\") pod \"certified-operators-mz9pq\" (UID: \"7e490083-3e13-4528-b489-8a7555abb9be\") " pod="openshift-marketplace/certified-operators-mz9pq"
Oct 09 10:52:31 crc kubenswrapper[4740]: I1009 10:52:31.377478 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e490083-3e13-4528-b489-8a7555abb9be-catalog-content\") pod \"certified-operators-mz9pq\" (UID: \"7e490083-3e13-4528-b489-8a7555abb9be\") " pod="openshift-marketplace/certified-operators-mz9pq"
Oct 09 10:52:31 crc kubenswrapper[4740]: I1009 10:52:31.377920 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e490083-3e13-4528-b489-8a7555abb9be-utilities\") pod \"certified-operators-mz9pq\" (UID: \"7e490083-3e13-4528-b489-8a7555abb9be\") " pod="openshift-marketplace/certified-operators-mz9pq"
Oct 09 10:52:31 crc kubenswrapper[4740]: I1009 10:52:31.378006 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e490083-3e13-4528-b489-8a7555abb9be-catalog-content\") pod \"certified-operators-mz9pq\" (UID: \"7e490083-3e13-4528-b489-8a7555abb9be\") " pod="openshift-marketplace/certified-operators-mz9pq"
Oct 09 10:52:31 crc kubenswrapper[4740]: I1009 10:52:31.397639 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m79r2\" (UniqueName: \"kubernetes.io/projected/7e490083-3e13-4528-b489-8a7555abb9be-kube-api-access-m79r2\") pod \"certified-operators-mz9pq\" (UID: \"7e490083-3e13-4528-b489-8a7555abb9be\") " pod="openshift-marketplace/certified-operators-mz9pq"
Oct 09 10:52:31 crc kubenswrapper[4740]: I1009 10:52:31.472782 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mz9pq"
Oct 09 10:52:31 crc kubenswrapper[4740]: I1009 10:52:31.958981 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mz9pq"]
Oct 09 10:52:32 crc kubenswrapper[4740]: I1009 10:52:32.276335 4740 generic.go:334] "Generic (PLEG): container finished" podID="7e490083-3e13-4528-b489-8a7555abb9be" containerID="489cdadd52c2c4f171f244b5696dc1d677ec234f11054714ba7632aa095ef808" exitCode=0
Oct 09 10:52:32 crc kubenswrapper[4740]: I1009 10:52:32.276530 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mz9pq" event={"ID":"7e490083-3e13-4528-b489-8a7555abb9be","Type":"ContainerDied","Data":"489cdadd52c2c4f171f244b5696dc1d677ec234f11054714ba7632aa095ef808"}
Oct 09 10:52:32 crc kubenswrapper[4740]: I1009 10:52:32.276699 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mz9pq" event={"ID":"7e490083-3e13-4528-b489-8a7555abb9be","Type":"ContainerStarted","Data":"3d81f2e1497cd135f447e725a87e44c17caca847bf170d43002b89e5be6cddf9"}
Oct 09 10:52:35 crc kubenswrapper[4740]: I1009 10:52:35.408075 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 09 10:52:35 crc kubenswrapper[4740]: I1009 10:52:35.408706 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 09 10:52:36 crc kubenswrapper[4740]: I1009 10:52:36.318410 4740 generic.go:334] "Generic (PLEG): container finished" podID="7e490083-3e13-4528-b489-8a7555abb9be" containerID="ecca796c364fdc22b802f9985e9c6617c5dd1fcd4b9126948260458e8f8a6a53" exitCode=0
Oct 09 10:52:36 crc kubenswrapper[4740]: I1009 10:52:36.318646 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mz9pq" event={"ID":"7e490083-3e13-4528-b489-8a7555abb9be","Type":"ContainerDied","Data":"ecca796c364fdc22b802f9985e9c6617c5dd1fcd4b9126948260458e8f8a6a53"}
Oct 09 10:52:37 crc kubenswrapper[4740]: I1009 10:52:37.328913 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mz9pq" event={"ID":"7e490083-3e13-4528-b489-8a7555abb9be","Type":"ContainerStarted","Data":"5fd44b25f6a4a607ee95e129038c0467aa3e04313ece077ac03bfb7658cf67af"}
Oct 09 10:52:37 crc kubenswrapper[4740]: I1009 10:52:37.354370 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mz9pq" podStartSLOduration=1.8552136080000001 podStartE2EDuration="6.354349289s" podCreationTimestamp="2025-10-09 10:52:31 +0000 UTC" firstStartedPulling="2025-10-09 10:52:32.278154438 +0000 UTC m=+1491.240354829" lastFinishedPulling="2025-10-09 10:52:36.777290129 +0000 UTC m=+1495.739490510" observedRunningTime="2025-10-09 10:52:37.347328403 +0000 UTC m=+1496.309528784" watchObservedRunningTime="2025-10-09 10:52:37.354349289 +0000 UTC m=+1496.316549670"
Oct 09 10:52:41 crc kubenswrapper[4740]: I1009 10:52:41.473639 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mz9pq"
Oct 09 10:52:41 crc kubenswrapper[4740]: I1009 10:52:41.474129 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mz9pq"
Oct 09 10:52:41 crc kubenswrapper[4740]: I1009 10:52:41.515142 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mz9pq"
Oct 09 10:52:42 crc kubenswrapper[4740]: I1009 10:52:42.417810 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mz9pq"
Oct 09 10:52:42 crc kubenswrapper[4740]: I1009 10:52:42.484002 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mz9pq"]
Oct 09 10:52:42 crc kubenswrapper[4740]: I1009 10:52:42.530464 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4t5tf"]
Oct 09 10:52:42 crc kubenswrapper[4740]: I1009 10:52:42.530690 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4t5tf" podUID="726d1cc8-a024-4ec8-a89c-bb7018c6b82f" containerName="registry-server" containerID="cri-o://05463da2ea0d184b0d4824621f1f12a6e105d6172a57519f5bc2daab52512951" gracePeriod=2
Oct 09 10:52:43 crc kubenswrapper[4740]: I1009 10:52:43.378107 4740 generic.go:334] "Generic (PLEG): container finished" podID="726d1cc8-a024-4ec8-a89c-bb7018c6b82f" containerID="05463da2ea0d184b0d4824621f1f12a6e105d6172a57519f5bc2daab52512951" exitCode=0
Oct 09 10:52:43 crc kubenswrapper[4740]: I1009 10:52:43.378180 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t5tf" event={"ID":"726d1cc8-a024-4ec8-a89c-bb7018c6b82f","Type":"ContainerDied","Data":"05463da2ea0d184b0d4824621f1f12a6e105d6172a57519f5bc2daab52512951"}
Oct 09 10:52:43 crc kubenswrapper[4740]: I1009 10:52:43.931200 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4t5tf"
Oct 09 10:52:44 crc kubenswrapper[4740]: I1009 10:52:44.039285 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/726d1cc8-a024-4ec8-a89c-bb7018c6b82f-utilities\") pod \"726d1cc8-a024-4ec8-a89c-bb7018c6b82f\" (UID: \"726d1cc8-a024-4ec8-a89c-bb7018c6b82f\") "
Oct 09 10:52:44 crc kubenswrapper[4740]: I1009 10:52:44.039407 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/726d1cc8-a024-4ec8-a89c-bb7018c6b82f-catalog-content\") pod \"726d1cc8-a024-4ec8-a89c-bb7018c6b82f\" (UID: \"726d1cc8-a024-4ec8-a89c-bb7018c6b82f\") "
Oct 09 10:52:44 crc kubenswrapper[4740]: I1009 10:52:44.039468 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq8j5\" (UniqueName: \"kubernetes.io/projected/726d1cc8-a024-4ec8-a89c-bb7018c6b82f-kube-api-access-sq8j5\") pod \"726d1cc8-a024-4ec8-a89c-bb7018c6b82f\" (UID: \"726d1cc8-a024-4ec8-a89c-bb7018c6b82f\") "
Oct 09 10:52:44 crc kubenswrapper[4740]: I1009 10:52:44.232347 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/726d1cc8-a024-4ec8-a89c-bb7018c6b82f-utilities" (OuterVolumeSpecName: "utilities") pod "726d1cc8-a024-4ec8-a89c-bb7018c6b82f" (UID: "726d1cc8-a024-4ec8-a89c-bb7018c6b82f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 10:52:44 crc kubenswrapper[4740]: I1009 10:52:44.238242 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/726d1cc8-a024-4ec8-a89c-bb7018c6b82f-kube-api-access-sq8j5" (OuterVolumeSpecName: "kube-api-access-sq8j5") pod "726d1cc8-a024-4ec8-a89c-bb7018c6b82f" (UID: "726d1cc8-a024-4ec8-a89c-bb7018c6b82f"). InnerVolumeSpecName "kube-api-access-sq8j5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:52:44 crc kubenswrapper[4740]: I1009 10:52:44.244598 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq8j5\" (UniqueName: \"kubernetes.io/projected/726d1cc8-a024-4ec8-a89c-bb7018c6b82f-kube-api-access-sq8j5\") on node \"crc\" DevicePath \"\""
Oct 09 10:52:44 crc kubenswrapper[4740]: I1009 10:52:44.244634 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/726d1cc8-a024-4ec8-a89c-bb7018c6b82f-utilities\") on node \"crc\" DevicePath \"\""
Oct 09 10:52:44 crc kubenswrapper[4740]: I1009 10:52:44.344155 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/726d1cc8-a024-4ec8-a89c-bb7018c6b82f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "726d1cc8-a024-4ec8-a89c-bb7018c6b82f" (UID: "726d1cc8-a024-4ec8-a89c-bb7018c6b82f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 10:52:44 crc kubenswrapper[4740]: I1009 10:52:44.346976 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/726d1cc8-a024-4ec8-a89c-bb7018c6b82f-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 09 10:52:44 crc kubenswrapper[4740]: I1009 10:52:44.390504 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4t5tf"
Oct 09 10:52:44 crc kubenswrapper[4740]: I1009 10:52:44.391879 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t5tf" event={"ID":"726d1cc8-a024-4ec8-a89c-bb7018c6b82f","Type":"ContainerDied","Data":"0f56ab2a782597668b30200f1d53285b7fca5d924cef6ccb726798b617c0fdb5"}
Oct 09 10:52:44 crc kubenswrapper[4740]: I1009 10:52:44.391936 4740 scope.go:117] "RemoveContainer" containerID="05463da2ea0d184b0d4824621f1f12a6e105d6172a57519f5bc2daab52512951"
Oct 09 10:52:44 crc kubenswrapper[4740]: I1009 10:52:44.426041 4740 scope.go:117] "RemoveContainer" containerID="70c61df985f2d051a2a3b1d4ac76854f7a2a3a6db1772a2f7781fc1241a52421"
Oct 09 10:52:44 crc kubenswrapper[4740]: I1009 10:52:44.430159 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4t5tf"]
Oct 09 10:52:44 crc kubenswrapper[4740]: I1009 10:52:44.437898 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4t5tf"]
Oct 09 10:52:44 crc kubenswrapper[4740]: I1009 10:52:44.655381 4740 scope.go:117] "RemoveContainer" containerID="abc09b558e57be4ccedabd392fd19b3f81966596d978ab9a0be29576be0252a9"
Oct 09 10:52:45 crc kubenswrapper[4740]: I1009 10:52:45.765632 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="726d1cc8-a024-4ec8-a89c-bb7018c6b82f" path="/var/lib/kubelet/pods/726d1cc8-a024-4ec8-a89c-bb7018c6b82f/volumes"
Oct 09 10:52:55 crc kubenswrapper[4740]: I1009 10:52:55.504563 4740 generic.go:334] "Generic (PLEG): container finished" podID="3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8" containerID="0dc29f55d35cf6181b6d47400aabd409e5ae774686a78de844b2289800963038" exitCode=0
Oct 09 10:52:55 crc kubenswrapper[4740]: I1009 10:52:55.504659 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8" event={"ID":"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8","Type":"ContainerDied","Data":"0dc29f55d35cf6181b6d47400aabd409e5ae774686a78de844b2289800963038"}
Oct 09 10:52:56 crc kubenswrapper[4740]: I1009 10:52:56.984604 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8"
Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.075570 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-bootstrap-combined-ca-bundle\") pod \"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8\" (UID: \"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8\") "
Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.075732 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-inventory\") pod \"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8\" (UID: \"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8\") "
Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.075851 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-ssh-key\") pod \"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8\" (UID: \"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8\") "
Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.076014 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnnkk\" (UniqueName: \"kubernetes.io/projected/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-kube-api-access-fnnkk\") pod \"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8\" (UID: \"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8\") "
Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.080703 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8" (UID: "3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.081221 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-kube-api-access-fnnkk" (OuterVolumeSpecName: "kube-api-access-fnnkk") pod "3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8" (UID: "3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8"). InnerVolumeSpecName "kube-api-access-fnnkk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.108927 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-inventory" (OuterVolumeSpecName: "inventory") pod "3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8" (UID: "3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.112041 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8" (UID: "3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8"). InnerVolumeSpecName "ssh-key".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.178215 4740 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.178264 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.178276 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.178289 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnnkk\" (UniqueName: \"kubernetes.io/projected/3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8-kube-api-access-fnnkk\") on node \"crc\" DevicePath \"\"" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.531449 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8" event={"ID":"3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8","Type":"ContainerDied","Data":"b75ca866ae868ce172fc895b62f28fafbc230f9ed216c4720438a1712ab02beb"} Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.532220 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b75ca866ae868ce172fc895b62f28fafbc230f9ed216c4720438a1712ab02beb" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.531694 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.614687 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s79wz"] Oct 09 10:52:57 crc kubenswrapper[4740]: E1009 10:52:57.615533 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="726d1cc8-a024-4ec8-a89c-bb7018c6b82f" containerName="registry-server" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.615557 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="726d1cc8-a024-4ec8-a89c-bb7018c6b82f" containerName="registry-server" Oct 09 10:52:57 crc kubenswrapper[4740]: E1009 10:52:57.615572 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.615582 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 09 10:52:57 crc kubenswrapper[4740]: E1009 10:52:57.615602 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="726d1cc8-a024-4ec8-a89c-bb7018c6b82f" containerName="extract-content" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.615610 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="726d1cc8-a024-4ec8-a89c-bb7018c6b82f" containerName="extract-content" Oct 09 10:52:57 crc kubenswrapper[4740]: E1009 10:52:57.615641 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="726d1cc8-a024-4ec8-a89c-bb7018c6b82f" containerName="extract-utilities" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.615648 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="726d1cc8-a024-4ec8-a89c-bb7018c6b82f" containerName="extract-utilities" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.615907 
4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.615938 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="726d1cc8-a024-4ec8-a89c-bb7018c6b82f" containerName="registry-server" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.616711 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s79wz" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.619323 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.619548 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.619613 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hslsm" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.619879 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.636656 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s79wz"] Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.687020 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ced3562-d429-4443-9aa2-82901f4f7797-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s79wz\" (UID: \"7ced3562-d429-4443-9aa2-82901f4f7797\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s79wz" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 
10:52:57.687159 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ced3562-d429-4443-9aa2-82901f4f7797-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s79wz\" (UID: \"7ced3562-d429-4443-9aa2-82901f4f7797\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s79wz" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.687262 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj66d\" (UniqueName: \"kubernetes.io/projected/7ced3562-d429-4443-9aa2-82901f4f7797-kube-api-access-xj66d\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s79wz\" (UID: \"7ced3562-d429-4443-9aa2-82901f4f7797\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s79wz" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.790041 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ced3562-d429-4443-9aa2-82901f4f7797-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s79wz\" (UID: \"7ced3562-d429-4443-9aa2-82901f4f7797\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s79wz" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.790201 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj66d\" (UniqueName: \"kubernetes.io/projected/7ced3562-d429-4443-9aa2-82901f4f7797-kube-api-access-xj66d\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s79wz\" (UID: \"7ced3562-d429-4443-9aa2-82901f4f7797\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s79wz" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.790239 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/7ced3562-d429-4443-9aa2-82901f4f7797-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s79wz\" (UID: \"7ced3562-d429-4443-9aa2-82901f4f7797\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s79wz" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.793742 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ced3562-d429-4443-9aa2-82901f4f7797-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s79wz\" (UID: \"7ced3562-d429-4443-9aa2-82901f4f7797\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s79wz" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.796524 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ced3562-d429-4443-9aa2-82901f4f7797-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s79wz\" (UID: \"7ced3562-d429-4443-9aa2-82901f4f7797\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s79wz" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.805640 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj66d\" (UniqueName: \"kubernetes.io/projected/7ced3562-d429-4443-9aa2-82901f4f7797-kube-api-access-xj66d\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s79wz\" (UID: \"7ced3562-d429-4443-9aa2-82901f4f7797\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s79wz" Oct 09 10:52:57 crc kubenswrapper[4740]: I1009 10:52:57.946647 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s79wz" Oct 09 10:52:58 crc kubenswrapper[4740]: I1009 10:52:58.532168 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s79wz"] Oct 09 10:52:58 crc kubenswrapper[4740]: W1009 10:52:58.535144 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ced3562_d429_4443_9aa2_82901f4f7797.slice/crio-0b33018a4fb50e42cbd3c1204889d8cb2ed122304cfe02dd2bdb609c7b9952c0 WatchSource:0}: Error finding container 0b33018a4fb50e42cbd3c1204889d8cb2ed122304cfe02dd2bdb609c7b9952c0: Status 404 returned error can't find the container with id 0b33018a4fb50e42cbd3c1204889d8cb2ed122304cfe02dd2bdb609c7b9952c0 Oct 09 10:52:59 crc kubenswrapper[4740]: I1009 10:52:59.553035 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s79wz" event={"ID":"7ced3562-d429-4443-9aa2-82901f4f7797","Type":"ContainerStarted","Data":"cc86cd7026990c0df03c8686defab374ceb930153138876374f3a81aa4646655"} Oct 09 10:52:59 crc kubenswrapper[4740]: I1009 10:52:59.553116 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s79wz" event={"ID":"7ced3562-d429-4443-9aa2-82901f4f7797","Type":"ContainerStarted","Data":"0b33018a4fb50e42cbd3c1204889d8cb2ed122304cfe02dd2bdb609c7b9952c0"} Oct 09 10:52:59 crc kubenswrapper[4740]: I1009 10:52:59.579187 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s79wz" podStartSLOduration=2.177157675 podStartE2EDuration="2.579168357s" podCreationTimestamp="2025-10-09 10:52:57 +0000 UTC" firstStartedPulling="2025-10-09 10:52:58.537433215 +0000 UTC m=+1517.499633596" lastFinishedPulling="2025-10-09 10:52:58.939443897 +0000 UTC 
m=+1517.901644278" observedRunningTime="2025-10-09 10:52:59.566688156 +0000 UTC m=+1518.528888537" watchObservedRunningTime="2025-10-09 10:52:59.579168357 +0000 UTC m=+1518.541368748" Oct 09 10:53:05 crc kubenswrapper[4740]: I1009 10:53:05.407568 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 10:53:05 crc kubenswrapper[4740]: I1009 10:53:05.408494 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 10:53:05 crc kubenswrapper[4740]: I1009 10:53:05.408575 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 10:53:05 crc kubenswrapper[4740]: I1009 10:53:05.409589 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562"} pod="openshift-machine-config-operator/machine-config-daemon-kdjch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 10:53:05 crc kubenswrapper[4740]: I1009 10:53:05.409674 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" containerID="cri-o://b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562" gracePeriod=600 Oct 09 10:53:05 crc 
kubenswrapper[4740]: E1009 10:53:05.535881 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 10:53:05 crc kubenswrapper[4740]: I1009 10:53:05.612252 4740 generic.go:334] "Generic (PLEG): container finished" podID="223b849a-db98-4f56-a649-9e144189950a" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562" exitCode=0 Oct 09 10:53:05 crc kubenswrapper[4740]: I1009 10:53:05.612290 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerDied","Data":"b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562"} Oct 09 10:53:05 crc kubenswrapper[4740]: I1009 10:53:05.612333 4740 scope.go:117] "RemoveContainer" containerID="198944336c8d712e7c21457778dd2b3f352b6a523b8c1f1ec0b48f4c6d926ff3" Oct 09 10:53:05 crc kubenswrapper[4740]: I1009 10:53:05.612950 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562" Oct 09 10:53:05 crc kubenswrapper[4740]: E1009 10:53:05.613182 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 10:53:10 crc kubenswrapper[4740]: I1009 10:53:10.186899 4740 scope.go:117] 
"RemoveContainer" containerID="eff4fe15174ef6aec62b353465763dde306566a9255148808771642fdd0c4772" Oct 09 10:53:10 crc kubenswrapper[4740]: I1009 10:53:10.214490 4740 scope.go:117] "RemoveContainer" containerID="6dd8b68ec2e4d8e9395e33995271870ffc141bc52635d64b60da17c2974a8b2c" Oct 09 10:53:16 crc kubenswrapper[4740]: I1009 10:53:16.755216 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562" Oct 09 10:53:16 crc kubenswrapper[4740]: E1009 10:53:16.755957 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 10:53:30 crc kubenswrapper[4740]: I1009 10:53:30.754059 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562" Oct 09 10:53:30 crc kubenswrapper[4740]: E1009 10:53:30.755053 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 10:53:41 crc kubenswrapper[4740]: I1009 10:53:41.760926 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562" Oct 09 10:53:41 crc kubenswrapper[4740]: E1009 10:53:41.761581 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 10:53:54 crc kubenswrapper[4740]: I1009 10:53:54.754321 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562" Oct 09 10:53:54 crc kubenswrapper[4740]: E1009 10:53:54.755432 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 10:53:55 crc kubenswrapper[4740]: I1009 10:53:55.042701 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-77xsq"] Oct 09 10:53:55 crc kubenswrapper[4740]: I1009 10:53:55.052398 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-77xsq"] Oct 09 10:53:55 crc kubenswrapper[4740]: I1009 10:53:55.769170 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb042060-39cf-4d73-be53-aa20360e48f1" path="/var/lib/kubelet/pods/cb042060-39cf-4d73-be53-aa20360e48f1/volumes" Oct 09 10:53:59 crc kubenswrapper[4740]: I1009 10:53:59.034302 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-clpxk"] Oct 09 10:53:59 crc kubenswrapper[4740]: I1009 10:53:59.045818 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-clpxk"] Oct 09 10:53:59 crc kubenswrapper[4740]: I1009 10:53:59.768064 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="34de7ce0-3d21-4ea0-9cbb-377ae365f423" path="/var/lib/kubelet/pods/34de7ce0-3d21-4ea0-9cbb-377ae365f423/volumes" Oct 09 10:54:00 crc kubenswrapper[4740]: I1009 10:54:00.055665 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-f7twr"] Oct 09 10:54:00 crc kubenswrapper[4740]: I1009 10:54:00.069241 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-f7twr"] Oct 09 10:54:01 crc kubenswrapper[4740]: I1009 10:54:01.765262 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50c526e9-512f-4541-9c40-dbe246d4afa9" path="/var/lib/kubelet/pods/50c526e9-512f-4541-9c40-dbe246d4afa9/volumes" Oct 09 10:54:05 crc kubenswrapper[4740]: I1009 10:54:05.027983 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-fc7c-account-create-7hr86"] Oct 09 10:54:05 crc kubenswrapper[4740]: I1009 10:54:05.039738 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-fc7c-account-create-7hr86"] Oct 09 10:54:05 crc kubenswrapper[4740]: I1009 10:54:05.766644 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c063f94-2867-463a-a5e7-a436d38ebb1d" path="/var/lib/kubelet/pods/1c063f94-2867-463a-a5e7-a436d38ebb1d/volumes" Oct 09 10:54:08 crc kubenswrapper[4740]: I1009 10:54:08.754163 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562" Oct 09 10:54:08 crc kubenswrapper[4740]: E1009 10:54:08.754512 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 10:54:10 crc kubenswrapper[4740]: I1009 
10:54:10.032646 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0fdb-account-create-7jl66"] Oct 09 10:54:10 crc kubenswrapper[4740]: I1009 10:54:10.042626 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d4a2-account-create-kj8v8"] Oct 09 10:54:10 crc kubenswrapper[4740]: I1009 10:54:10.049998 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d4a2-account-create-kj8v8"] Oct 09 10:54:10 crc kubenswrapper[4740]: I1009 10:54:10.057693 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-0fdb-account-create-7jl66"] Oct 09 10:54:10 crc kubenswrapper[4740]: I1009 10:54:10.300509 4740 scope.go:117] "RemoveContainer" containerID="fc0fac6a7dd1576484c374447515d2b754afe9e28ec468a90f186817d13235a4" Oct 09 10:54:10 crc kubenswrapper[4740]: I1009 10:54:10.327527 4740 scope.go:117] "RemoveContainer" containerID="68c121cb1b9b5d61067946f21aaa382efbedb856bac38ee5f18dbfdb60ffc68a" Oct 09 10:54:10 crc kubenswrapper[4740]: I1009 10:54:10.382397 4740 scope.go:117] "RemoveContainer" containerID="4bc58709a36d78e199b1fee4e2c77ebd1fe522c0a69fa6c7acfb3d8f5694a177" Oct 09 10:54:10 crc kubenswrapper[4740]: I1009 10:54:10.423484 4740 scope.go:117] "RemoveContainer" containerID="17c61b6deb7c03c82d02128991d6f3d6e679b8c28d2c7fea29077b59210c0d8f" Oct 09 10:54:10 crc kubenswrapper[4740]: I1009 10:54:10.465996 4740 scope.go:117] "RemoveContainer" containerID="6b41f3337bebd89d9b228c81cb1dfb6df151ad00f2a6e62c895aa024d1415b04" Oct 09 10:54:10 crc kubenswrapper[4740]: I1009 10:54:10.526659 4740 scope.go:117] "RemoveContainer" containerID="250a88eff5719440e2f3afca5e0f8af2fb1e6743e20cec7c527db6d28f462c40" Oct 09 10:54:11 crc kubenswrapper[4740]: I1009 10:54:11.766009 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1de7a924-745d-40e0-8358-271bb7034f87" path="/var/lib/kubelet/pods/1de7a924-745d-40e0-8358-271bb7034f87/volumes" Oct 09 10:54:11 crc kubenswrapper[4740]: 
I1009 10:54:11.768355 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82d7f29f-de12-4aed-95ee-47b80666098b" path="/var/lib/kubelet/pods/82d7f29f-de12-4aed-95ee-47b80666098b/volumes" Oct 09 10:54:20 crc kubenswrapper[4740]: I1009 10:54:20.754261 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562" Oct 09 10:54:20 crc kubenswrapper[4740]: E1009 10:54:20.755659 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 10:54:26 crc kubenswrapper[4740]: I1009 10:54:26.026946 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-r8w7m"] Oct 09 10:54:26 crc kubenswrapper[4740]: I1009 10:54:26.035376 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-8wxbl"] Oct 09 10:54:26 crc kubenswrapper[4740]: I1009 10:54:26.045731 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-8wxbl"] Oct 09 10:54:26 crc kubenswrapper[4740]: I1009 10:54:26.054430 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-r8w7m"] Oct 09 10:54:27 crc kubenswrapper[4740]: I1009 10:54:27.763234 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cd4630f-ae0a-422a-9a31-a5b833aa9f79" path="/var/lib/kubelet/pods/1cd4630f-ae0a-422a-9a31-a5b833aa9f79/volumes" Oct 09 10:54:27 crc kubenswrapper[4740]: I1009 10:54:27.764010 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7618ac8f-2d0b-49da-943b-13dd939652d0" 
path="/var/lib/kubelet/pods/7618ac8f-2d0b-49da-943b-13dd939652d0/volumes" Oct 09 10:54:28 crc kubenswrapper[4740]: I1009 10:54:28.420340 4740 generic.go:334] "Generic (PLEG): container finished" podID="7ced3562-d429-4443-9aa2-82901f4f7797" containerID="cc86cd7026990c0df03c8686defab374ceb930153138876374f3a81aa4646655" exitCode=0 Oct 09 10:54:28 crc kubenswrapper[4740]: I1009 10:54:28.420391 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s79wz" event={"ID":"7ced3562-d429-4443-9aa2-82901f4f7797","Type":"ContainerDied","Data":"cc86cd7026990c0df03c8686defab374ceb930153138876374f3a81aa4646655"} Oct 09 10:54:29 crc kubenswrapper[4740]: I1009 10:54:29.032170 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-btzp2"] Oct 09 10:54:29 crc kubenswrapper[4740]: I1009 10:54:29.039449 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-btzp2"] Oct 09 10:54:29 crc kubenswrapper[4740]: I1009 10:54:29.764608 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93ce61e7-f832-4491-a80f-0f0bc24d15cd" path="/var/lib/kubelet/pods/93ce61e7-f832-4491-a80f-0f0bc24d15cd/volumes" Oct 09 10:54:29 crc kubenswrapper[4740]: I1009 10:54:29.918425 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s79wz" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.038320 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ced3562-d429-4443-9aa2-82901f4f7797-ssh-key\") pod \"7ced3562-d429-4443-9aa2-82901f4f7797\" (UID: \"7ced3562-d429-4443-9aa2-82901f4f7797\") " Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.038456 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ced3562-d429-4443-9aa2-82901f4f7797-inventory\") pod \"7ced3562-d429-4443-9aa2-82901f4f7797\" (UID: \"7ced3562-d429-4443-9aa2-82901f4f7797\") " Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.038582 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj66d\" (UniqueName: \"kubernetes.io/projected/7ced3562-d429-4443-9aa2-82901f4f7797-kube-api-access-xj66d\") pod \"7ced3562-d429-4443-9aa2-82901f4f7797\" (UID: \"7ced3562-d429-4443-9aa2-82901f4f7797\") " Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.050305 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ced3562-d429-4443-9aa2-82901f4f7797-kube-api-access-xj66d" (OuterVolumeSpecName: "kube-api-access-xj66d") pod "7ced3562-d429-4443-9aa2-82901f4f7797" (UID: "7ced3562-d429-4443-9aa2-82901f4f7797"). InnerVolumeSpecName "kube-api-access-xj66d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.050700 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-cpnq8"] Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.059987 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-cpnq8"] Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.075172 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ced3562-d429-4443-9aa2-82901f4f7797-inventory" (OuterVolumeSpecName: "inventory") pod "7ced3562-d429-4443-9aa2-82901f4f7797" (UID: "7ced3562-d429-4443-9aa2-82901f4f7797"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.091594 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ced3562-d429-4443-9aa2-82901f4f7797-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7ced3562-d429-4443-9aa2-82901f4f7797" (UID: "7ced3562-d429-4443-9aa2-82901f4f7797"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.141356 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj66d\" (UniqueName: \"kubernetes.io/projected/7ced3562-d429-4443-9aa2-82901f4f7797-kube-api-access-xj66d\") on node \"crc\" DevicePath \"\"" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.141407 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ced3562-d429-4443-9aa2-82901f4f7797-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.141427 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ced3562-d429-4443-9aa2-82901f4f7797-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.448889 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s79wz" event={"ID":"7ced3562-d429-4443-9aa2-82901f4f7797","Type":"ContainerDied","Data":"0b33018a4fb50e42cbd3c1204889d8cb2ed122304cfe02dd2bdb609c7b9952c0"} Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.448954 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b33018a4fb50e42cbd3c1204889d8cb2ed122304cfe02dd2bdb609c7b9952c0" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.448998 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s79wz" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.540541 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wndn5"] Oct 09 10:54:30 crc kubenswrapper[4740]: E1009 10:54:30.540922 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ced3562-d429-4443-9aa2-82901f4f7797" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.540941 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ced3562-d429-4443-9aa2-82901f4f7797" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.541143 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ced3562-d429-4443-9aa2-82901f4f7797" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.542161 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wndn5" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.545133 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.545211 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.545263 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.545585 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hslsm" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.556441 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wndn5"] Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.651515 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/091c1607-1916-4dfd-9e3d-95dbe5534e98-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wndn5\" (UID: \"091c1607-1916-4dfd-9e3d-95dbe5534e98\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wndn5" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.651623 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5hdv\" (UniqueName: \"kubernetes.io/projected/091c1607-1916-4dfd-9e3d-95dbe5534e98-kube-api-access-c5hdv\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wndn5\" (UID: \"091c1607-1916-4dfd-9e3d-95dbe5534e98\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wndn5" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 
10:54:30.651875 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/091c1607-1916-4dfd-9e3d-95dbe5534e98-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wndn5\" (UID: \"091c1607-1916-4dfd-9e3d-95dbe5534e98\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wndn5" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.753372 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/091c1607-1916-4dfd-9e3d-95dbe5534e98-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wndn5\" (UID: \"091c1607-1916-4dfd-9e3d-95dbe5534e98\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wndn5" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.753451 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5hdv\" (UniqueName: \"kubernetes.io/projected/091c1607-1916-4dfd-9e3d-95dbe5534e98-kube-api-access-c5hdv\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wndn5\" (UID: \"091c1607-1916-4dfd-9e3d-95dbe5534e98\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wndn5" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.753509 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/091c1607-1916-4dfd-9e3d-95dbe5534e98-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wndn5\" (UID: \"091c1607-1916-4dfd-9e3d-95dbe5534e98\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wndn5" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.757238 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/091c1607-1916-4dfd-9e3d-95dbe5534e98-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-wndn5\" (UID: \"091c1607-1916-4dfd-9e3d-95dbe5534e98\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wndn5" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.758562 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/091c1607-1916-4dfd-9e3d-95dbe5534e98-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wndn5\" (UID: \"091c1607-1916-4dfd-9e3d-95dbe5534e98\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wndn5" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.779465 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5hdv\" (UniqueName: \"kubernetes.io/projected/091c1607-1916-4dfd-9e3d-95dbe5534e98-kube-api-access-c5hdv\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wndn5\" (UID: \"091c1607-1916-4dfd-9e3d-95dbe5534e98\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wndn5" Oct 09 10:54:30 crc kubenswrapper[4740]: I1009 10:54:30.872013 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wndn5" Oct 09 10:54:31 crc kubenswrapper[4740]: I1009 10:54:31.401335 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wndn5"] Oct 09 10:54:31 crc kubenswrapper[4740]: W1009 10:54:31.402161 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod091c1607_1916_4dfd_9e3d_95dbe5534e98.slice/crio-8b5909345ca6fcb4ae10464420869c177321d14aabb42fc4c140d2be9dac19a0 WatchSource:0}: Error finding container 8b5909345ca6fcb4ae10464420869c177321d14aabb42fc4c140d2be9dac19a0: Status 404 returned error can't find the container with id 8b5909345ca6fcb4ae10464420869c177321d14aabb42fc4c140d2be9dac19a0 Oct 09 10:54:31 crc kubenswrapper[4740]: I1009 10:54:31.405490 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 10:54:31 crc kubenswrapper[4740]: I1009 10:54:31.458004 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wndn5" event={"ID":"091c1607-1916-4dfd-9e3d-95dbe5534e98","Type":"ContainerStarted","Data":"8b5909345ca6fcb4ae10464420869c177321d14aabb42fc4c140d2be9dac19a0"} Oct 09 10:54:31 crc kubenswrapper[4740]: I1009 10:54:31.789770 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c004cb4-8052-425c-ac2e-11159a708cad" path="/var/lib/kubelet/pods/9c004cb4-8052-425c-ac2e-11159a708cad/volumes" Oct 09 10:54:32 crc kubenswrapper[4740]: I1009 10:54:32.467794 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wndn5" event={"ID":"091c1607-1916-4dfd-9e3d-95dbe5534e98","Type":"ContainerStarted","Data":"7f3dadff1b072bf9cd5fa1edf4ab6f8b3db378fd325386594df40e22cab0a3a8"} Oct 09 10:54:32 crc kubenswrapper[4740]: I1009 
10:54:32.485245 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wndn5" podStartSLOduration=2.036438362 podStartE2EDuration="2.485218924s" podCreationTimestamp="2025-10-09 10:54:30 +0000 UTC" firstStartedPulling="2025-10-09 10:54:31.405252728 +0000 UTC m=+1610.367453109" lastFinishedPulling="2025-10-09 10:54:31.85403329 +0000 UTC m=+1610.816233671" observedRunningTime="2025-10-09 10:54:32.481353591 +0000 UTC m=+1611.443553972" watchObservedRunningTime="2025-10-09 10:54:32.485218924 +0000 UTC m=+1611.447419345" Oct 09 10:54:34 crc kubenswrapper[4740]: I1009 10:54:34.035010 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-kh7ft"] Oct 09 10:54:34 crc kubenswrapper[4740]: I1009 10:54:34.042262 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-kh7ft"] Oct 09 10:54:34 crc kubenswrapper[4740]: I1009 10:54:34.753545 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562" Oct 09 10:54:34 crc kubenswrapper[4740]: E1009 10:54:34.754134 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 10:54:35 crc kubenswrapper[4740]: I1009 10:54:35.774816 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="365ec886-8ed3-4c64-a794-502bfef4fedf" path="/var/lib/kubelet/pods/365ec886-8ed3-4c64-a794-502bfef4fedf/volumes" Oct 09 10:54:36 crc kubenswrapper[4740]: I1009 10:54:36.033090 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-dc53-account-create-56z9f"] Oct 09 10:54:36 crc kubenswrapper[4740]: I1009 10:54:36.041133 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-dc53-account-create-56z9f"] Oct 09 10:54:37 crc kubenswrapper[4740]: I1009 10:54:37.028928 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-4f86-account-create-b7lmp"] Oct 09 10:54:37 crc kubenswrapper[4740]: I1009 10:54:37.042847 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-4f86-account-create-b7lmp"] Oct 09 10:54:37 crc kubenswrapper[4740]: I1009 10:54:37.773805 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da198044-a030-4982-bdb9-9a232c4a1191" path="/var/lib/kubelet/pods/da198044-a030-4982-bdb9-9a232c4a1191/volumes" Oct 09 10:54:37 crc kubenswrapper[4740]: I1009 10:54:37.775684 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbcc00a8-003c-48f3-b7e5-5bade54830fe" path="/var/lib/kubelet/pods/dbcc00a8-003c-48f3-b7e5-5bade54830fe/volumes" Oct 09 10:54:46 crc kubenswrapper[4740]: I1009 10:54:46.754129 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562" Oct 09 10:54:46 crc kubenswrapper[4740]: E1009 10:54:46.756062 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 10:55:00 crc kubenswrapper[4740]: I1009 10:55:00.753698 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562" Oct 09 10:55:00 crc kubenswrapper[4740]: E1009 10:55:00.754722 4740 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 10:55:10 crc kubenswrapper[4740]: I1009 10:55:10.637315 4740 scope.go:117] "RemoveContainer" containerID="5947c915f15c763c4aa2bccee5f0ac63b696b4ba5de58ec4500d3d837709bf20" Oct 09 10:55:10 crc kubenswrapper[4740]: I1009 10:55:10.667938 4740 scope.go:117] "RemoveContainer" containerID="afa0bcca481f8ce37f904baba840c3e1bad385de4188ae6934673bd790799008" Oct 09 10:55:10 crc kubenswrapper[4740]: I1009 10:55:10.727342 4740 scope.go:117] "RemoveContainer" containerID="e95a494f4cec7122196c5455a913e1f022e67b48ab06ea717a5487b01b4353a2" Oct 09 10:55:10 crc kubenswrapper[4740]: I1009 10:55:10.779132 4740 scope.go:117] "RemoveContainer" containerID="1f1e975d16d493d8fdbe5f5ab458c032d2611afb170e4e49909e38dfdb50620c" Oct 09 10:55:10 crc kubenswrapper[4740]: I1009 10:55:10.851773 4740 scope.go:117] "RemoveContainer" containerID="146ae6f0c7b06a3d9faf9183a4eb846527d25f121b313170f7b786d828402864" Oct 09 10:55:10 crc kubenswrapper[4740]: I1009 10:55:10.877243 4740 scope.go:117] "RemoveContainer" containerID="1c1cda55d07490473d5da1be62c95e7b64c81eb399c4c134772c5887bd60efb0" Oct 09 10:55:10 crc kubenswrapper[4740]: I1009 10:55:10.923377 4740 scope.go:117] "RemoveContainer" containerID="50abcfd8c11eac6174bc771ce70680b1bd07a5a3b6b353876701c1c09e71815c" Oct 09 10:55:10 crc kubenswrapper[4740]: I1009 10:55:10.943809 4740 scope.go:117] "RemoveContainer" containerID="cf09e3d7980f8112f9248ddd31242a6090428f7c3bf95ae6fb5ba6ee2994cf74" Oct 09 10:55:10 crc kubenswrapper[4740]: I1009 10:55:10.976271 4740 scope.go:117] "RemoveContainer" 
containerID="3fd82077a9bea0733b9951d5635c15b01d30d6035800b1115bbc23f7e25999f4" Oct 09 10:55:12 crc kubenswrapper[4740]: I1009 10:55:12.037138 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c7a9-account-create-6np7n"] Oct 09 10:55:12 crc kubenswrapper[4740]: I1009 10:55:12.046993 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c7a9-account-create-6np7n"] Oct 09 10:55:13 crc kubenswrapper[4740]: I1009 10:55:13.775433 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5caee6ca-48fd-48a5-b84c-81d04b03a650" path="/var/lib/kubelet/pods/5caee6ca-48fd-48a5-b84c-81d04b03a650/volumes" Oct 09 10:55:14 crc kubenswrapper[4740]: I1009 10:55:14.038367 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-dzzbk"] Oct 09 10:55:14 crc kubenswrapper[4740]: I1009 10:55:14.047565 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-dzzbk"] Oct 09 10:55:14 crc kubenswrapper[4740]: I1009 10:55:14.753941 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562" Oct 09 10:55:14 crc kubenswrapper[4740]: E1009 10:55:14.754285 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 10:55:15 crc kubenswrapper[4740]: I1009 10:55:15.771024 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be99ba98-fb4b-4609-986e-3636a4a8f244" path="/var/lib/kubelet/pods/be99ba98-fb4b-4609-986e-3636a4a8f244/volumes" Oct 09 10:55:20 crc kubenswrapper[4740]: I1009 10:55:20.033645 4740 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-wkc7b"] Oct 09 10:55:20 crc kubenswrapper[4740]: I1009 10:55:20.043467 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-wkc7b"] Oct 09 10:55:21 crc kubenswrapper[4740]: I1009 10:55:21.033716 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-ktvhb"] Oct 09 10:55:21 crc kubenswrapper[4740]: I1009 10:55:21.042884 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-ktvhb"] Oct 09 10:55:21 crc kubenswrapper[4740]: I1009 10:55:21.765692 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4808b047-cb78-4910-8c22-65514e99c2cc" path="/var/lib/kubelet/pods/4808b047-cb78-4910-8c22-65514e99c2cc/volumes" Oct 09 10:55:21 crc kubenswrapper[4740]: I1009 10:55:21.766286 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a8fb50-724c-4b07-83e2-71d8ee90cb05" path="/var/lib/kubelet/pods/71a8fb50-724c-4b07-83e2-71d8ee90cb05/volumes" Oct 09 10:55:27 crc kubenswrapper[4740]: I1009 10:55:27.754089 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562" Oct 09 10:55:27 crc kubenswrapper[4740]: E1009 10:55:27.754936 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 10:55:33 crc kubenswrapper[4740]: I1009 10:55:33.044628 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-mw6z4"] Oct 09 10:55:33 crc kubenswrapper[4740]: I1009 10:55:33.051896 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-db-sync-mw6z4"] Oct 09 10:55:33 crc kubenswrapper[4740]: I1009 10:55:33.767733 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3062e734-0f07-4e8f-862e-a2906e7bbbd5" path="/var/lib/kubelet/pods/3062e734-0f07-4e8f-862e-a2906e7bbbd5/volumes" Oct 09 10:55:39 crc kubenswrapper[4740]: I1009 10:55:39.124565 4740 generic.go:334] "Generic (PLEG): container finished" podID="091c1607-1916-4dfd-9e3d-95dbe5534e98" containerID="7f3dadff1b072bf9cd5fa1edf4ab6f8b3db378fd325386594df40e22cab0a3a8" exitCode=0 Oct 09 10:55:39 crc kubenswrapper[4740]: I1009 10:55:39.124646 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wndn5" event={"ID":"091c1607-1916-4dfd-9e3d-95dbe5534e98","Type":"ContainerDied","Data":"7f3dadff1b072bf9cd5fa1edf4ab6f8b3db378fd325386594df40e22cab0a3a8"} Oct 09 10:55:40 crc kubenswrapper[4740]: I1009 10:55:40.572298 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wndn5" Oct 09 10:55:40 crc kubenswrapper[4740]: I1009 10:55:40.654420 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/091c1607-1916-4dfd-9e3d-95dbe5534e98-ssh-key\") pod \"091c1607-1916-4dfd-9e3d-95dbe5534e98\" (UID: \"091c1607-1916-4dfd-9e3d-95dbe5534e98\") " Oct 09 10:55:40 crc kubenswrapper[4740]: I1009 10:55:40.654482 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5hdv\" (UniqueName: \"kubernetes.io/projected/091c1607-1916-4dfd-9e3d-95dbe5534e98-kube-api-access-c5hdv\") pod \"091c1607-1916-4dfd-9e3d-95dbe5534e98\" (UID: \"091c1607-1916-4dfd-9e3d-95dbe5534e98\") " Oct 09 10:55:40 crc kubenswrapper[4740]: I1009 10:55:40.654652 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/091c1607-1916-4dfd-9e3d-95dbe5534e98-inventory\") pod \"091c1607-1916-4dfd-9e3d-95dbe5534e98\" (UID: \"091c1607-1916-4dfd-9e3d-95dbe5534e98\") " Oct 09 10:55:40 crc kubenswrapper[4740]: I1009 10:55:40.660968 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/091c1607-1916-4dfd-9e3d-95dbe5534e98-kube-api-access-c5hdv" (OuterVolumeSpecName: "kube-api-access-c5hdv") pod "091c1607-1916-4dfd-9e3d-95dbe5534e98" (UID: "091c1607-1916-4dfd-9e3d-95dbe5534e98"). InnerVolumeSpecName "kube-api-access-c5hdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:55:40 crc kubenswrapper[4740]: I1009 10:55:40.685371 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/091c1607-1916-4dfd-9e3d-95dbe5534e98-inventory" (OuterVolumeSpecName: "inventory") pod "091c1607-1916-4dfd-9e3d-95dbe5534e98" (UID: "091c1607-1916-4dfd-9e3d-95dbe5534e98"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:55:40 crc kubenswrapper[4740]: I1009 10:55:40.689025 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/091c1607-1916-4dfd-9e3d-95dbe5534e98-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "091c1607-1916-4dfd-9e3d-95dbe5534e98" (UID: "091c1607-1916-4dfd-9e3d-95dbe5534e98"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:55:40 crc kubenswrapper[4740]: I1009 10:55:40.757352 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/091c1607-1916-4dfd-9e3d-95dbe5534e98-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 10:55:40 crc kubenswrapper[4740]: I1009 10:55:40.757398 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/091c1607-1916-4dfd-9e3d-95dbe5534e98-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 10:55:40 crc kubenswrapper[4740]: I1009 10:55:40.757411 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5hdv\" (UniqueName: \"kubernetes.io/projected/091c1607-1916-4dfd-9e3d-95dbe5534e98-kube-api-access-c5hdv\") on node \"crc\" DevicePath \"\"" Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.148531 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wndn5" event={"ID":"091c1607-1916-4dfd-9e3d-95dbe5534e98","Type":"ContainerDied","Data":"8b5909345ca6fcb4ae10464420869c177321d14aabb42fc4c140d2be9dac19a0"} Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.148581 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b5909345ca6fcb4ae10464420869c177321d14aabb42fc4c140d2be9dac19a0" Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.148698 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wndn5" Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.268718 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq"] Oct 09 10:55:41 crc kubenswrapper[4740]: E1009 10:55:41.269390 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="091c1607-1916-4dfd-9e3d-95dbe5534e98" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.269416 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="091c1607-1916-4dfd-9e3d-95dbe5534e98" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.269675 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="091c1607-1916-4dfd-9e3d-95dbe5534e98" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.275167 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq" Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.275183 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq"] Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.278511 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.278672 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.278619 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hslsm" Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.282449 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.303580 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19338c28-ee36-4273-8f74-f34767a3fcb1-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq\" (UID: \"19338c28-ee36-4273-8f74-f34767a3fcb1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq" Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.304093 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19338c28-ee36-4273-8f74-f34767a3fcb1-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq\" (UID: \"19338c28-ee36-4273-8f74-f34767a3fcb1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq" Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.304361 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skd8p\" (UniqueName: \"kubernetes.io/projected/19338c28-ee36-4273-8f74-f34767a3fcb1-kube-api-access-skd8p\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq\" (UID: \"19338c28-ee36-4273-8f74-f34767a3fcb1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq" Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.407737 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19338c28-ee36-4273-8f74-f34767a3fcb1-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq\" (UID: \"19338c28-ee36-4273-8f74-f34767a3fcb1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq" Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.407859 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skd8p\" (UniqueName: \"kubernetes.io/projected/19338c28-ee36-4273-8f74-f34767a3fcb1-kube-api-access-skd8p\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq\" (UID: \"19338c28-ee36-4273-8f74-f34767a3fcb1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq" Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.407906 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19338c28-ee36-4273-8f74-f34767a3fcb1-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq\" (UID: \"19338c28-ee36-4273-8f74-f34767a3fcb1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq" Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.419589 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19338c28-ee36-4273-8f74-f34767a3fcb1-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq\" (UID: \"19338c28-ee36-4273-8f74-f34767a3fcb1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq"
Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.434287 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19338c28-ee36-4273-8f74-f34767a3fcb1-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq\" (UID: \"19338c28-ee36-4273-8f74-f34767a3fcb1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq"
Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.445488 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skd8p\" (UniqueName: \"kubernetes.io/projected/19338c28-ee36-4273-8f74-f34767a3fcb1-kube-api-access-skd8p\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq\" (UID: \"19338c28-ee36-4273-8f74-f34767a3fcb1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq"
Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.601988 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq"
Oct 09 10:55:41 crc kubenswrapper[4740]: I1009 10:55:41.759641 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562"
Oct 09 10:55:41 crc kubenswrapper[4740]: E1009 10:55:41.760382 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a"
Oct 09 10:55:42 crc kubenswrapper[4740]: I1009 10:55:42.116636 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq"]
Oct 09 10:55:42 crc kubenswrapper[4740]: I1009 10:55:42.157882 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq" event={"ID":"19338c28-ee36-4273-8f74-f34767a3fcb1","Type":"ContainerStarted","Data":"2f31626830f9561481d4e6a1e312e85f05063607eb71f692bfaebb0e3b59c634"}
Oct 09 10:55:42 crc kubenswrapper[4740]: I1009 10:55:42.608855 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 09 10:55:43 crc kubenswrapper[4740]: I1009 10:55:43.168444 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq" event={"ID":"19338c28-ee36-4273-8f74-f34767a3fcb1","Type":"ContainerStarted","Data":"99383cfb677b0ee0e0d328a48c061d4c835fb4128b73fb34c67a51e161063693"}
Oct 09 10:55:43 crc kubenswrapper[4740]: I1009 10:55:43.186479 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq" podStartSLOduration=1.706824613 podStartE2EDuration="2.186453647s" podCreationTimestamp="2025-10-09 10:55:41 +0000 UTC" firstStartedPulling="2025-10-09 10:55:42.126379377 +0000 UTC m=+1681.088579758" lastFinishedPulling="2025-10-09 10:55:42.606008401 +0000 UTC m=+1681.568208792" observedRunningTime="2025-10-09 10:55:43.184999847 +0000 UTC m=+1682.147200238" watchObservedRunningTime="2025-10-09 10:55:43.186453647 +0000 UTC m=+1682.148654058"
Oct 09 10:55:44 crc kubenswrapper[4740]: I1009 10:55:44.029098 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-f8w6m"]
Oct 09 10:55:44 crc kubenswrapper[4740]: I1009 10:55:44.037474 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-f8w6m"]
Oct 09 10:55:45 crc kubenswrapper[4740]: I1009 10:55:45.765477 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb96e05e-80fb-4eec-b609-123ed43152ae" path="/var/lib/kubelet/pods/fb96e05e-80fb-4eec-b609-123ed43152ae/volumes"
Oct 09 10:55:48 crc kubenswrapper[4740]: I1009 10:55:48.214539 4740 generic.go:334] "Generic (PLEG): container finished" podID="19338c28-ee36-4273-8f74-f34767a3fcb1" containerID="99383cfb677b0ee0e0d328a48c061d4c835fb4128b73fb34c67a51e161063693" exitCode=0
Oct 09 10:55:48 crc kubenswrapper[4740]: I1009 10:55:48.214590 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq" event={"ID":"19338c28-ee36-4273-8f74-f34767a3fcb1","Type":"ContainerDied","Data":"99383cfb677b0ee0e0d328a48c061d4c835fb4128b73fb34c67a51e161063693"}
Oct 09 10:55:49 crc kubenswrapper[4740]: I1009 10:55:49.675350 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq"
Oct 09 10:55:49 crc kubenswrapper[4740]: I1009 10:55:49.869330 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19338c28-ee36-4273-8f74-f34767a3fcb1-inventory\") pod \"19338c28-ee36-4273-8f74-f34767a3fcb1\" (UID: \"19338c28-ee36-4273-8f74-f34767a3fcb1\") "
Oct 09 10:55:49 crc kubenswrapper[4740]: I1009 10:55:49.869633 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skd8p\" (UniqueName: \"kubernetes.io/projected/19338c28-ee36-4273-8f74-f34767a3fcb1-kube-api-access-skd8p\") pod \"19338c28-ee36-4273-8f74-f34767a3fcb1\" (UID: \"19338c28-ee36-4273-8f74-f34767a3fcb1\") "
Oct 09 10:55:49 crc kubenswrapper[4740]: I1009 10:55:49.869719 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19338c28-ee36-4273-8f74-f34767a3fcb1-ssh-key\") pod \"19338c28-ee36-4273-8f74-f34767a3fcb1\" (UID: \"19338c28-ee36-4273-8f74-f34767a3fcb1\") "
Oct 09 10:55:49 crc kubenswrapper[4740]: I1009 10:55:49.879017 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19338c28-ee36-4273-8f74-f34767a3fcb1-kube-api-access-skd8p" (OuterVolumeSpecName: "kube-api-access-skd8p") pod "19338c28-ee36-4273-8f74-f34767a3fcb1" (UID: "19338c28-ee36-4273-8f74-f34767a3fcb1"). InnerVolumeSpecName "kube-api-access-skd8p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:55:49 crc kubenswrapper[4740]: I1009 10:55:49.903546 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19338c28-ee36-4273-8f74-f34767a3fcb1-inventory" (OuterVolumeSpecName: "inventory") pod "19338c28-ee36-4273-8f74-f34767a3fcb1" (UID: "19338c28-ee36-4273-8f74-f34767a3fcb1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:55:49 crc kubenswrapper[4740]: I1009 10:55:49.906063 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19338c28-ee36-4273-8f74-f34767a3fcb1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "19338c28-ee36-4273-8f74-f34767a3fcb1" (UID: "19338c28-ee36-4273-8f74-f34767a3fcb1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:55:49 crc kubenswrapper[4740]: I1009 10:55:49.972705 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19338c28-ee36-4273-8f74-f34767a3fcb1-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 09 10:55:49 crc kubenswrapper[4740]: I1009 10:55:49.972786 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19338c28-ee36-4273-8f74-f34767a3fcb1-inventory\") on node \"crc\" DevicePath \"\""
Oct 09 10:55:49 crc kubenswrapper[4740]: I1009 10:55:49.972808 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skd8p\" (UniqueName: \"kubernetes.io/projected/19338c28-ee36-4273-8f74-f34767a3fcb1-kube-api-access-skd8p\") on node \"crc\" DevicePath \"\""
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.236236 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq" event={"ID":"19338c28-ee36-4273-8f74-f34767a3fcb1","Type":"ContainerDied","Data":"2f31626830f9561481d4e6a1e312e85f05063607eb71f692bfaebb0e3b59c634"}
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.236281 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f31626830f9561481d4e6a1e312e85f05063607eb71f692bfaebb0e3b59c634"
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.236300 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq"
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.316232 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-8fhb7"]
Oct 09 10:55:50 crc kubenswrapper[4740]: E1009 10:55:50.316847 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19338c28-ee36-4273-8f74-f34767a3fcb1" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.316879 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="19338c28-ee36-4273-8f74-f34767a3fcb1" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.317237 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="19338c28-ee36-4273-8f74-f34767a3fcb1" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.318291 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8fhb7"
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.321231 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.321901 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.322267 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.322413 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hslsm"
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.327928 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-8fhb7"]
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.482886 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52c814fd-0700-4e3e-8302-19324617f7c5-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8fhb7\" (UID: \"52c814fd-0700-4e3e-8302-19324617f7c5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8fhb7"
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.482938 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52c814fd-0700-4e3e-8302-19324617f7c5-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8fhb7\" (UID: \"52c814fd-0700-4e3e-8302-19324617f7c5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8fhb7"
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.483014 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg4vj\" (UniqueName: \"kubernetes.io/projected/52c814fd-0700-4e3e-8302-19324617f7c5-kube-api-access-bg4vj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8fhb7\" (UID: \"52c814fd-0700-4e3e-8302-19324617f7c5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8fhb7"
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.584420 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg4vj\" (UniqueName: \"kubernetes.io/projected/52c814fd-0700-4e3e-8302-19324617f7c5-kube-api-access-bg4vj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8fhb7\" (UID: \"52c814fd-0700-4e3e-8302-19324617f7c5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8fhb7"
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.584734 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52c814fd-0700-4e3e-8302-19324617f7c5-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8fhb7\" (UID: \"52c814fd-0700-4e3e-8302-19324617f7c5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8fhb7"
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.584855 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52c814fd-0700-4e3e-8302-19324617f7c5-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8fhb7\" (UID: \"52c814fd-0700-4e3e-8302-19324617f7c5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8fhb7"
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.592010 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52c814fd-0700-4e3e-8302-19324617f7c5-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8fhb7\" (UID: \"52c814fd-0700-4e3e-8302-19324617f7c5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8fhb7"
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.592496 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52c814fd-0700-4e3e-8302-19324617f7c5-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8fhb7\" (UID: \"52c814fd-0700-4e3e-8302-19324617f7c5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8fhb7"
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.609307 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg4vj\" (UniqueName: \"kubernetes.io/projected/52c814fd-0700-4e3e-8302-19324617f7c5-kube-api-access-bg4vj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8fhb7\" (UID: \"52c814fd-0700-4e3e-8302-19324617f7c5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8fhb7"
Oct 09 10:55:50 crc kubenswrapper[4740]: I1009 10:55:50.690692 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8fhb7"
Oct 09 10:55:51 crc kubenswrapper[4740]: I1009 10:55:51.276089 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-8fhb7"]
Oct 09 10:55:51 crc kubenswrapper[4740]: W1009 10:55:51.280206 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52c814fd_0700_4e3e_8302_19324617f7c5.slice/crio-a1bb101531205c2a6b6da93b13690cc928a70f644c57b6950d2d99e2c0f623d9 WatchSource:0}: Error finding container a1bb101531205c2a6b6da93b13690cc928a70f644c57b6950d2d99e2c0f623d9: Status 404 returned error can't find the container with id a1bb101531205c2a6b6da93b13690cc928a70f644c57b6950d2d99e2c0f623d9
Oct 09 10:55:52 crc kubenswrapper[4740]: I1009 10:55:52.255700 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8fhb7" event={"ID":"52c814fd-0700-4e3e-8302-19324617f7c5","Type":"ContainerStarted","Data":"32976c499001e1cbbeb56103b2ded98eccdf36f447f6e92ea1a013645b21191b"}
Oct 09 10:55:52 crc kubenswrapper[4740]: I1009 10:55:52.256304 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8fhb7" event={"ID":"52c814fd-0700-4e3e-8302-19324617f7c5","Type":"ContainerStarted","Data":"a1bb101531205c2a6b6da93b13690cc928a70f644c57b6950d2d99e2c0f623d9"}
Oct 09 10:55:52 crc kubenswrapper[4740]: I1009 10:55:52.272896 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8fhb7" podStartSLOduration=1.839804558 podStartE2EDuration="2.2728673s" podCreationTimestamp="2025-10-09 10:55:50 +0000 UTC" firstStartedPulling="2025-10-09 10:55:51.282727605 +0000 UTC m=+1690.244928026" lastFinishedPulling="2025-10-09 10:55:51.715790377 +0000 UTC m=+1690.677990768" observedRunningTime="2025-10-09 10:55:52.270017982 +0000 UTC m=+1691.232218363" watchObservedRunningTime="2025-10-09 10:55:52.2728673 +0000 UTC m=+1691.235067681"
Oct 09 10:55:56 crc kubenswrapper[4740]: I1009 10:55:56.754225 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562"
Oct 09 10:55:56 crc kubenswrapper[4740]: E1009 10:55:56.754837 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a"
Oct 09 10:56:07 crc kubenswrapper[4740]: I1009 10:56:07.033211 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-kcxlk"]
Oct 09 10:56:07 crc kubenswrapper[4740]: I1009 10:56:07.046069 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-nbhjc"]
Oct 09 10:56:07 crc kubenswrapper[4740]: I1009 10:56:07.053448 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-kcxlk"]
Oct 09 10:56:07 crc kubenswrapper[4740]: I1009 10:56:07.059918 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-nbhjc"]
Oct 09 10:56:07 crc kubenswrapper[4740]: I1009 10:56:07.768834 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b317805-3df8-459e-a489-955c34dfb3d7" path="/var/lib/kubelet/pods/7b317805-3df8-459e-a489-955c34dfb3d7/volumes"
Oct 09 10:56:07 crc kubenswrapper[4740]: I1009 10:56:07.769415 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1437726-4284-4da9-a89e-d68ac67b5546" path="/var/lib/kubelet/pods/d1437726-4284-4da9-a89e-d68ac67b5546/volumes"
Oct 09 10:56:08 crc kubenswrapper[4740]: I1009 10:56:08.028660 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-mx6td"]
Oct 09 10:56:08 crc kubenswrapper[4740]: I1009 10:56:08.040001 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-mx6td"]
Oct 09 10:56:09 crc kubenswrapper[4740]: I1009 10:56:09.763405 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51d8e6e3-8b07-45b3-8967-3eab12dab011" path="/var/lib/kubelet/pods/51d8e6e3-8b07-45b3-8967-3eab12dab011/volumes"
Oct 09 10:56:10 crc kubenswrapper[4740]: I1009 10:56:10.754939 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562"
Oct 09 10:56:10 crc kubenswrapper[4740]: E1009 10:56:10.755469 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a"
Oct 09 10:56:11 crc kubenswrapper[4740]: I1009 10:56:11.137084 4740 scope.go:117] "RemoveContainer" containerID="08a803f91f066820fd59c7e3e405a1c4934c77c751a2c0dc019586ab31cbf341"
Oct 09 10:56:11 crc kubenswrapper[4740]: I1009 10:56:11.205237 4740 scope.go:117] "RemoveContainer" containerID="ca08e689127b608fcdc76804ee25ac188b19c9ba210d1536ab9fb62b1ddbde5a"
Oct 09 10:56:11 crc kubenswrapper[4740]: I1009 10:56:11.252299 4740 scope.go:117] "RemoveContainer" containerID="5b75be64d23dd84ced609d4b115b958cb2dfaf49a99ac0c3602d3d7777d78eb4"
Oct 09 10:56:11 crc kubenswrapper[4740]: I1009 10:56:11.336478 4740 scope.go:117] "RemoveContainer" containerID="af9dc27255d65f1bf502f880090b93e80c3869a57ba2a8c0d714eb227ee24b90"
Oct 09 10:56:11 crc kubenswrapper[4740]: I1009 10:56:11.369089 4740 scope.go:117] "RemoveContainer" containerID="b8ab9aba5a57457832fd6b9e6c262c311855d895d6d4aecc56f1305481ffa62e"
Oct 09 10:56:11 crc kubenswrapper[4740]: I1009 10:56:11.407393 4740 scope.go:117] "RemoveContainer" containerID="10035550e5c5e9edbc0f68f1e00132b0ba979f7e45b5a19fcc5c069c73e6b908"
Oct 09 10:56:11 crc kubenswrapper[4740]: I1009 10:56:11.467666 4740 scope.go:117] "RemoveContainer" containerID="e83d5bb2dc4d7b48caa48225f8f2e9b1f3a576acb8ab58452fb925c2655a5947"
Oct 09 10:56:11 crc kubenswrapper[4740]: I1009 10:56:11.490046 4740 scope.go:117] "RemoveContainer" containerID="4b2dffbb6ac734e76b4e18a760104669ebd7c2c3141819ef59a5116f09a77f27"
Oct 09 10:56:11 crc kubenswrapper[4740]: I1009 10:56:11.507382 4740 scope.go:117] "RemoveContainer" containerID="a9428d4dec3d0dd3758cf0ec40f2ff733ad0913f706f5f38a629023ae5b65b63"
Oct 09 10:56:22 crc kubenswrapper[4740]: I1009 10:56:22.031674 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-c064-account-create-h5v2f"]
Oct 09 10:56:22 crc kubenswrapper[4740]: I1009 10:56:22.045851 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-92ee-account-create-jdtxn"]
Oct 09 10:56:22 crc kubenswrapper[4740]: I1009 10:56:22.052504 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-3d7e-account-create-g7zgb"]
Oct 09 10:56:22 crc kubenswrapper[4740]: I1009 10:56:22.059002 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-c064-account-create-h5v2f"]
Oct 09 10:56:22 crc kubenswrapper[4740]: I1009 10:56:22.064886 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-3d7e-account-create-g7zgb"]
Oct 09 10:56:22 crc kubenswrapper[4740]: I1009 10:56:22.071102 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-92ee-account-create-jdtxn"]
Oct 09 10:56:22 crc kubenswrapper[4740]: I1009 10:56:22.754370 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562"
Oct 09 10:56:22 crc kubenswrapper[4740]: E1009 10:56:22.754641 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a"
Oct 09 10:56:23 crc kubenswrapper[4740]: I1009 10:56:23.788118 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57e9966b-77cc-4158-8cb0-703ee3cb30f5" path="/var/lib/kubelet/pods/57e9966b-77cc-4158-8cb0-703ee3cb30f5/volumes"
Oct 09 10:56:23 crc kubenswrapper[4740]: I1009 10:56:23.790129 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb294930-8e11-4d2a-8965-6451b647fb16" path="/var/lib/kubelet/pods/cb294930-8e11-4d2a-8965-6451b647fb16/volumes"
Oct 09 10:56:23 crc kubenswrapper[4740]: I1009 10:56:23.791393 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6d55cea-26c8-45cc-a7fd-c741620a164a" path="/var/lib/kubelet/pods/d6d55cea-26c8-45cc-a7fd-c741620a164a/volumes"
Oct 09 10:56:26 crc kubenswrapper[4740]: I1009 10:56:26.600549 4740 generic.go:334] "Generic (PLEG): container finished" podID="52c814fd-0700-4e3e-8302-19324617f7c5" containerID="32976c499001e1cbbeb56103b2ded98eccdf36f447f6e92ea1a013645b21191b" exitCode=0
Oct 09 10:56:26 crc kubenswrapper[4740]: I1009 10:56:26.600699 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8fhb7" event={"ID":"52c814fd-0700-4e3e-8302-19324617f7c5","Type":"ContainerDied","Data":"32976c499001e1cbbeb56103b2ded98eccdf36f447f6e92ea1a013645b21191b"}
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.018359 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8fhb7"
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.137627 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52c814fd-0700-4e3e-8302-19324617f7c5-ssh-key\") pod \"52c814fd-0700-4e3e-8302-19324617f7c5\" (UID: \"52c814fd-0700-4e3e-8302-19324617f7c5\") "
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.138037 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg4vj\" (UniqueName: \"kubernetes.io/projected/52c814fd-0700-4e3e-8302-19324617f7c5-kube-api-access-bg4vj\") pod \"52c814fd-0700-4e3e-8302-19324617f7c5\" (UID: \"52c814fd-0700-4e3e-8302-19324617f7c5\") "
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.138319 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52c814fd-0700-4e3e-8302-19324617f7c5-inventory\") pod \"52c814fd-0700-4e3e-8302-19324617f7c5\" (UID: \"52c814fd-0700-4e3e-8302-19324617f7c5\") "
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.143188 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52c814fd-0700-4e3e-8302-19324617f7c5-kube-api-access-bg4vj" (OuterVolumeSpecName: "kube-api-access-bg4vj") pod "52c814fd-0700-4e3e-8302-19324617f7c5" (UID: "52c814fd-0700-4e3e-8302-19324617f7c5"). InnerVolumeSpecName "kube-api-access-bg4vj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.164447 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c814fd-0700-4e3e-8302-19324617f7c5-inventory" (OuterVolumeSpecName: "inventory") pod "52c814fd-0700-4e3e-8302-19324617f7c5" (UID: "52c814fd-0700-4e3e-8302-19324617f7c5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.170645 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c814fd-0700-4e3e-8302-19324617f7c5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "52c814fd-0700-4e3e-8302-19324617f7c5" (UID: "52c814fd-0700-4e3e-8302-19324617f7c5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.240414 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52c814fd-0700-4e3e-8302-19324617f7c5-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.240455 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg4vj\" (UniqueName: \"kubernetes.io/projected/52c814fd-0700-4e3e-8302-19324617f7c5-kube-api-access-bg4vj\") on node \"crc\" DevicePath \"\""
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.240471 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52c814fd-0700-4e3e-8302-19324617f7c5-inventory\") on node \"crc\" DevicePath \"\""
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.618561 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8fhb7" event={"ID":"52c814fd-0700-4e3e-8302-19324617f7c5","Type":"ContainerDied","Data":"a1bb101531205c2a6b6da93b13690cc928a70f644c57b6950d2d99e2c0f623d9"}
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.618597 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1bb101531205c2a6b6da93b13690cc928a70f644c57b6950d2d99e2c0f623d9"
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.619183 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8fhb7"
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.775253 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7"]
Oct 09 10:56:28 crc kubenswrapper[4740]: E1009 10:56:28.775780 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c814fd-0700-4e3e-8302-19324617f7c5" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.775808 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c814fd-0700-4e3e-8302-19324617f7c5" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.776024 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c814fd-0700-4e3e-8302-19324617f7c5" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.776855 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7"
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.778918 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.781238 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.781956 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.782038 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hslsm"
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.787471 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7"]
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.874141 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d13c7792-e2d1-4ce2-b965-f77bd77b0cd0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7\" (UID: \"d13c7792-e2d1-4ce2-b965-f77bd77b0cd0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7"
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.874237 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf998\" (UniqueName: \"kubernetes.io/projected/d13c7792-e2d1-4ce2-b965-f77bd77b0cd0-kube-api-access-kf998\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7\" (UID: \"d13c7792-e2d1-4ce2-b965-f77bd77b0cd0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7"
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.874329 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d13c7792-e2d1-4ce2-b965-f77bd77b0cd0-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7\" (UID: \"d13c7792-e2d1-4ce2-b965-f77bd77b0cd0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7"
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.975771 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf998\" (UniqueName: \"kubernetes.io/projected/d13c7792-e2d1-4ce2-b965-f77bd77b0cd0-kube-api-access-kf998\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7\" (UID: \"d13c7792-e2d1-4ce2-b965-f77bd77b0cd0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7"
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.976138 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d13c7792-e2d1-4ce2-b965-f77bd77b0cd0-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7\" (UID: \"d13c7792-e2d1-4ce2-b965-f77bd77b0cd0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7"
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.976279 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d13c7792-e2d1-4ce2-b965-f77bd77b0cd0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7\" (UID: \"d13c7792-e2d1-4ce2-b965-f77bd77b0cd0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7"
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.980376 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d13c7792-e2d1-4ce2-b965-f77bd77b0cd0-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7\" (UID: \"d13c7792-e2d1-4ce2-b965-f77bd77b0cd0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7"
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.981252 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d13c7792-e2d1-4ce2-b965-f77bd77b0cd0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7\" (UID: \"d13c7792-e2d1-4ce2-b965-f77bd77b0cd0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7"
Oct 09 10:56:28 crc kubenswrapper[4740]: I1009 10:56:28.992122 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf998\" (UniqueName: \"kubernetes.io/projected/d13c7792-e2d1-4ce2-b965-f77bd77b0cd0-kube-api-access-kf998\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7\" (UID: \"d13c7792-e2d1-4ce2-b965-f77bd77b0cd0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7"
Oct 09 10:56:29 crc kubenswrapper[4740]: I1009 10:56:29.098393 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7"
Oct 09 10:56:29 crc kubenswrapper[4740]: I1009 10:56:29.615136 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7"]
Oct 09 10:56:29 crc kubenswrapper[4740]: I1009 10:56:29.628686 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7" event={"ID":"d13c7792-e2d1-4ce2-b965-f77bd77b0cd0","Type":"ContainerStarted","Data":"b9c091d3173349c50722b744ce5172696b83efacd36b8f1cb92c9959fa4c355f"}
Oct 09 10:56:30 crc kubenswrapper[4740]: I1009 10:56:30.639847 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7" event={"ID":"d13c7792-e2d1-4ce2-b965-f77bd77b0cd0","Type":"ContainerStarted","Data":"e667ad2bbead5533193d5572de95aae08e7c062c4cb1d2eb4231cb8485f23df1"}
Oct 09 10:56:30 crc kubenswrapper[4740]: I1009 10:56:30.662473 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7" podStartSLOduration=2.201181989 podStartE2EDuration="2.662450875s" podCreationTimestamp="2025-10-09 10:56:28 +0000 UTC" firstStartedPulling="2025-10-09 10:56:29.619709638 +0000 UTC m=+1728.581910019" lastFinishedPulling="2025-10-09 10:56:30.080978484 +0000 UTC m=+1729.043178905" observedRunningTime="2025-10-09 10:56:30.655940647 +0000 UTC m=+1729.618141038" watchObservedRunningTime="2025-10-09 10:56:30.662450875 +0000 UTC m=+1729.624651256"
Oct 09 10:56:33 crc kubenswrapper[4740]: I1009 10:56:33.754704 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562"
Oct 09 10:56:33 crc kubenswrapper[4740]: E1009 10:56:33.755719 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a"
Oct 09 10:56:46 crc kubenswrapper[4740]: I1009 10:56:46.046301 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gwq7v"]
Oct 09 10:56:46 crc kubenswrapper[4740]: I1009 10:56:46.059245 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gwq7v"]
Oct 09 10:56:47 crc kubenswrapper[4740]: I1009 10:56:47.765524 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a69aabb5-1a08-483b-b60b-65080c36912c" path="/var/lib/kubelet/pods/a69aabb5-1a08-483b-b60b-65080c36912c/volumes"
Oct 09 10:56:48 crc kubenswrapper[4740]: I1009 10:56:48.754390 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562"
Oct 09 10:56:48 crc kubenswrapper[4740]: E1009 10:56:48.755169 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a"
Oct 09 10:56:59 crc kubenswrapper[4740]: I1009 10:56:59.753982 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562"
Oct 09 10:56:59 crc kubenswrapper[4740]: E1009 10:56:59.754719 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a"
Oct 09 10:57:09 crc kubenswrapper[4740]: I1009 10:57:09.071405 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-c7th8"]
Oct 09 10:57:09 crc kubenswrapper[4740]: I1009 10:57:09.084943 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-c7th8"]
Oct 09 10:57:09 crc kubenswrapper[4740]: I1009 10:57:09.764612 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6af75d27-96e9-44d0-95cc-d0137b792f96" path="/var/lib/kubelet/pods/6af75d27-96e9-44d0-95cc-d0137b792f96/volumes"
Oct 09 10:57:10 crc kubenswrapper[4740]: I1009 10:57:10.031204 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kncvx"]
Oct 09 10:57:10 crc kubenswrapper[4740]: I1009 10:57:10.043024 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kncvx"]
Oct 09 10:57:11 crc kubenswrapper[4740]: I1009 10:57:11.672004 4740 scope.go:117] "RemoveContainer" containerID="2fdd68a273eb8a46c593e136d890dc23d53405150f2a6f0cbbfa25cd415f3f83"
Oct 09 10:57:11 crc kubenswrapper[4740]: I1009 10:57:11.711878 4740 scope.go:117] "RemoveContainer" containerID="11b89eb5f162c845cf376342103dc4a861f7efe4c7797fe301820df30e6a64fb"
Oct 09 10:57:11 crc kubenswrapper[4740]: I1009 10:57:11.759442 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562"
Oct 09 10:57:11 crc kubenswrapper[4740]: E1009 10:57:11.759685 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 10:57:11 crc kubenswrapper[4740]: I1009 10:57:11.768873 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82f30f7b-d441-4f72-aa2b-9fd450738e6d" path="/var/lib/kubelet/pods/82f30f7b-d441-4f72-aa2b-9fd450738e6d/volumes" Oct 09 10:57:11 crc kubenswrapper[4740]: I1009 10:57:11.776018 4740 scope.go:117] "RemoveContainer" containerID="076989a1e587d3337b54a7081133dd69bf8bef51967206bbb774a5c8d3669522" Oct 09 10:57:11 crc kubenswrapper[4740]: I1009 10:57:11.847747 4740 scope.go:117] "RemoveContainer" containerID="2f0c772a86b0ec45f94d9955b092d9053a8a3b47a4d6b1957f73a9b0f76ae1d3" Oct 09 10:57:11 crc kubenswrapper[4740]: I1009 10:57:11.867490 4740 scope.go:117] "RemoveContainer" containerID="7423e2ae9cc7d138508f764b745adfcca9c4d4c73313b74e561ecafeef343599" Oct 09 10:57:22 crc kubenswrapper[4740]: I1009 10:57:22.753657 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562" Oct 09 10:57:22 crc kubenswrapper[4740]: E1009 10:57:22.754353 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 10:57:25 crc kubenswrapper[4740]: I1009 10:57:25.132041 4740 generic.go:334] "Generic (PLEG): container finished" podID="d13c7792-e2d1-4ce2-b965-f77bd77b0cd0" containerID="e667ad2bbead5533193d5572de95aae08e7c062c4cb1d2eb4231cb8485f23df1" exitCode=2 Oct 09 10:57:25 crc kubenswrapper[4740]: I1009 10:57:25.132100 
4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7" event={"ID":"d13c7792-e2d1-4ce2-b965-f77bd77b0cd0","Type":"ContainerDied","Data":"e667ad2bbead5533193d5572de95aae08e7c062c4cb1d2eb4231cb8485f23df1"} Oct 09 10:57:26 crc kubenswrapper[4740]: I1009 10:57:26.566362 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7" Oct 09 10:57:26 crc kubenswrapper[4740]: I1009 10:57:26.698895 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d13c7792-e2d1-4ce2-b965-f77bd77b0cd0-ssh-key\") pod \"d13c7792-e2d1-4ce2-b965-f77bd77b0cd0\" (UID: \"d13c7792-e2d1-4ce2-b965-f77bd77b0cd0\") " Oct 09 10:57:26 crc kubenswrapper[4740]: I1009 10:57:26.699155 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d13c7792-e2d1-4ce2-b965-f77bd77b0cd0-inventory\") pod \"d13c7792-e2d1-4ce2-b965-f77bd77b0cd0\" (UID: \"d13c7792-e2d1-4ce2-b965-f77bd77b0cd0\") " Oct 09 10:57:26 crc kubenswrapper[4740]: I1009 10:57:26.699199 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf998\" (UniqueName: \"kubernetes.io/projected/d13c7792-e2d1-4ce2-b965-f77bd77b0cd0-kube-api-access-kf998\") pod \"d13c7792-e2d1-4ce2-b965-f77bd77b0cd0\" (UID: \"d13c7792-e2d1-4ce2-b965-f77bd77b0cd0\") " Oct 09 10:57:26 crc kubenswrapper[4740]: I1009 10:57:26.705682 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d13c7792-e2d1-4ce2-b965-f77bd77b0cd0-kube-api-access-kf998" (OuterVolumeSpecName: "kube-api-access-kf998") pod "d13c7792-e2d1-4ce2-b965-f77bd77b0cd0" (UID: "d13c7792-e2d1-4ce2-b965-f77bd77b0cd0"). InnerVolumeSpecName "kube-api-access-kf998". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:57:26 crc kubenswrapper[4740]: I1009 10:57:26.745765 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d13c7792-e2d1-4ce2-b965-f77bd77b0cd0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d13c7792-e2d1-4ce2-b965-f77bd77b0cd0" (UID: "d13c7792-e2d1-4ce2-b965-f77bd77b0cd0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:57:26 crc kubenswrapper[4740]: I1009 10:57:26.753007 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d13c7792-e2d1-4ce2-b965-f77bd77b0cd0-inventory" (OuterVolumeSpecName: "inventory") pod "d13c7792-e2d1-4ce2-b965-f77bd77b0cd0" (UID: "d13c7792-e2d1-4ce2-b965-f77bd77b0cd0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:57:26 crc kubenswrapper[4740]: I1009 10:57:26.801028 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d13c7792-e2d1-4ce2-b965-f77bd77b0cd0-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 10:57:26 crc kubenswrapper[4740]: I1009 10:57:26.801063 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf998\" (UniqueName: \"kubernetes.io/projected/d13c7792-e2d1-4ce2-b965-f77bd77b0cd0-kube-api-access-kf998\") on node \"crc\" DevicePath \"\"" Oct 09 10:57:26 crc kubenswrapper[4740]: I1009 10:57:26.801078 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d13c7792-e2d1-4ce2-b965-f77bd77b0cd0-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 10:57:27 crc kubenswrapper[4740]: I1009 10:57:27.155060 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7" 
event={"ID":"d13c7792-e2d1-4ce2-b965-f77bd77b0cd0","Type":"ContainerDied","Data":"b9c091d3173349c50722b744ce5172696b83efacd36b8f1cb92c9959fa4c355f"} Oct 09 10:57:27 crc kubenswrapper[4740]: I1009 10:57:27.155112 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9c091d3173349c50722b744ce5172696b83efacd36b8f1cb92c9959fa4c355f" Oct 09 10:57:27 crc kubenswrapper[4740]: I1009 10:57:27.155208 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7" Oct 09 10:57:34 crc kubenswrapper[4740]: I1009 10:57:34.037904 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr"] Oct 09 10:57:34 crc kubenswrapper[4740]: E1009 10:57:34.039246 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13c7792-e2d1-4ce2-b965-f77bd77b0cd0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 10:57:34 crc kubenswrapper[4740]: I1009 10:57:34.039275 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13c7792-e2d1-4ce2-b965-f77bd77b0cd0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 10:57:34 crc kubenswrapper[4740]: I1009 10:57:34.039623 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="d13c7792-e2d1-4ce2-b965-f77bd77b0cd0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 10:57:34 crc kubenswrapper[4740]: I1009 10:57:34.040904 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr" Oct 09 10:57:34 crc kubenswrapper[4740]: I1009 10:57:34.052877 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr"] Oct 09 10:57:34 crc kubenswrapper[4740]: I1009 10:57:34.102743 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 10:57:34 crc kubenswrapper[4740]: I1009 10:57:34.102956 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 10:57:34 crc kubenswrapper[4740]: I1009 10:57:34.103026 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hslsm" Oct 09 10:57:34 crc kubenswrapper[4740]: I1009 10:57:34.106023 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 10:57:34 crc kubenswrapper[4740]: I1009 10:57:34.203457 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b62efe3-f320-4b06-9b4f-6cdebea2c83c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr\" (UID: \"6b62efe3-f320-4b06-9b4f-6cdebea2c83c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr" Oct 09 10:57:34 crc kubenswrapper[4740]: I1009 10:57:34.203550 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b62efe3-f320-4b06-9b4f-6cdebea2c83c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr\" (UID: \"6b62efe3-f320-4b06-9b4f-6cdebea2c83c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr" Oct 09 10:57:34 crc kubenswrapper[4740]: I1009 10:57:34.203606 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv8ln\" (UniqueName: \"kubernetes.io/projected/6b62efe3-f320-4b06-9b4f-6cdebea2c83c-kube-api-access-tv8ln\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr\" (UID: \"6b62efe3-f320-4b06-9b4f-6cdebea2c83c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr" Oct 09 10:57:34 crc kubenswrapper[4740]: I1009 10:57:34.305393 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv8ln\" (UniqueName: \"kubernetes.io/projected/6b62efe3-f320-4b06-9b4f-6cdebea2c83c-kube-api-access-tv8ln\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr\" (UID: \"6b62efe3-f320-4b06-9b4f-6cdebea2c83c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr" Oct 09 10:57:34 crc kubenswrapper[4740]: I1009 10:57:34.305667 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b62efe3-f320-4b06-9b4f-6cdebea2c83c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr\" (UID: \"6b62efe3-f320-4b06-9b4f-6cdebea2c83c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr" Oct 09 10:57:34 crc kubenswrapper[4740]: I1009 10:57:34.305737 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b62efe3-f320-4b06-9b4f-6cdebea2c83c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr\" (UID: \"6b62efe3-f320-4b06-9b4f-6cdebea2c83c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr" Oct 09 10:57:34 crc kubenswrapper[4740]: I1009 10:57:34.312179 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b62efe3-f320-4b06-9b4f-6cdebea2c83c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr\" (UID: 
\"6b62efe3-f320-4b06-9b4f-6cdebea2c83c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr" Oct 09 10:57:34 crc kubenswrapper[4740]: I1009 10:57:34.312995 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b62efe3-f320-4b06-9b4f-6cdebea2c83c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr\" (UID: \"6b62efe3-f320-4b06-9b4f-6cdebea2c83c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr" Oct 09 10:57:34 crc kubenswrapper[4740]: I1009 10:57:34.329040 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv8ln\" (UniqueName: \"kubernetes.io/projected/6b62efe3-f320-4b06-9b4f-6cdebea2c83c-kube-api-access-tv8ln\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr\" (UID: \"6b62efe3-f320-4b06-9b4f-6cdebea2c83c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr" Oct 09 10:57:34 crc kubenswrapper[4740]: I1009 10:57:34.426216 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr" Oct 09 10:57:34 crc kubenswrapper[4740]: I1009 10:57:34.754066 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562" Oct 09 10:57:34 crc kubenswrapper[4740]: E1009 10:57:34.754694 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 10:57:34 crc kubenswrapper[4740]: I1009 10:57:34.957028 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr"] Oct 09 10:57:35 crc kubenswrapper[4740]: I1009 10:57:35.227372 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr" event={"ID":"6b62efe3-f320-4b06-9b4f-6cdebea2c83c","Type":"ContainerStarted","Data":"140732b81ac0596387e9e3ca1a5593c0c9cc63f5dc03404bb61e3bce2f096e40"} Oct 09 10:57:36 crc kubenswrapper[4740]: I1009 10:57:36.236581 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr" event={"ID":"6b62efe3-f320-4b06-9b4f-6cdebea2c83c","Type":"ContainerStarted","Data":"0c4d023847fda56043580d03f9edaa2064eee4426b78d342fbad8b64ba1c754f"} Oct 09 10:57:36 crc kubenswrapper[4740]: I1009 10:57:36.254436 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr" podStartSLOduration=1.789338765 podStartE2EDuration="2.254420205s" podCreationTimestamp="2025-10-09 10:57:34 +0000 UTC" firstStartedPulling="2025-10-09 
10:57:34.97353403 +0000 UTC m=+1793.935734411" lastFinishedPulling="2025-10-09 10:57:35.43861547 +0000 UTC m=+1794.400815851" observedRunningTime="2025-10-09 10:57:36.251824874 +0000 UTC m=+1795.214025265" watchObservedRunningTime="2025-10-09 10:57:36.254420205 +0000 UTC m=+1795.216620586" Oct 09 10:57:45 crc kubenswrapper[4740]: I1009 10:57:45.753886 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562" Oct 09 10:57:45 crc kubenswrapper[4740]: E1009 10:57:45.756152 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 10:57:53 crc kubenswrapper[4740]: I1009 10:57:53.048860 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-xvz9n"] Oct 09 10:57:53 crc kubenswrapper[4740]: I1009 10:57:53.058602 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-xvz9n"] Oct 09 10:57:53 crc kubenswrapper[4740]: I1009 10:57:53.781033 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b098d98-b0c4-46f4-b79b-57a6405f0385" path="/var/lib/kubelet/pods/5b098d98-b0c4-46f4-b79b-57a6405f0385/volumes" Oct 09 10:57:56 crc kubenswrapper[4740]: I1009 10:57:56.753783 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562" Oct 09 10:57:56 crc kubenswrapper[4740]: E1009 10:57:56.755095 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 10:58:10 crc kubenswrapper[4740]: I1009 10:58:10.753934 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562" Oct 09 10:58:11 crc kubenswrapper[4740]: I1009 10:58:11.573263 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerStarted","Data":"3448287b6cd68c3403bb27caa7100e27359be5b949fa1d87e08098aaeba8b363"} Oct 09 10:58:12 crc kubenswrapper[4740]: I1009 10:58:12.042608 4740 scope.go:117] "RemoveContainer" containerID="57943945125a869628425fad9f5d33c104a0605c5da0aa105261f713413e4840" Oct 09 10:58:12 crc kubenswrapper[4740]: I1009 10:58:12.085597 4740 scope.go:117] "RemoveContainer" containerID="3c6b520bce85df793bad710b51aded2089c1f86036f6e559def420a2218311ce" Oct 09 10:58:20 crc kubenswrapper[4740]: I1009 10:58:20.665972 4740 generic.go:334] "Generic (PLEG): container finished" podID="6b62efe3-f320-4b06-9b4f-6cdebea2c83c" containerID="0c4d023847fda56043580d03f9edaa2064eee4426b78d342fbad8b64ba1c754f" exitCode=0 Oct 09 10:58:20 crc kubenswrapper[4740]: I1009 10:58:20.666253 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr" event={"ID":"6b62efe3-f320-4b06-9b4f-6cdebea2c83c","Type":"ContainerDied","Data":"0c4d023847fda56043580d03f9edaa2064eee4426b78d342fbad8b64ba1c754f"} Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.112370 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr" Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.231428 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv8ln\" (UniqueName: \"kubernetes.io/projected/6b62efe3-f320-4b06-9b4f-6cdebea2c83c-kube-api-access-tv8ln\") pod \"6b62efe3-f320-4b06-9b4f-6cdebea2c83c\" (UID: \"6b62efe3-f320-4b06-9b4f-6cdebea2c83c\") " Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.231543 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b62efe3-f320-4b06-9b4f-6cdebea2c83c-inventory\") pod \"6b62efe3-f320-4b06-9b4f-6cdebea2c83c\" (UID: \"6b62efe3-f320-4b06-9b4f-6cdebea2c83c\") " Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.231820 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b62efe3-f320-4b06-9b4f-6cdebea2c83c-ssh-key\") pod \"6b62efe3-f320-4b06-9b4f-6cdebea2c83c\" (UID: \"6b62efe3-f320-4b06-9b4f-6cdebea2c83c\") " Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.237672 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b62efe3-f320-4b06-9b4f-6cdebea2c83c-kube-api-access-tv8ln" (OuterVolumeSpecName: "kube-api-access-tv8ln") pod "6b62efe3-f320-4b06-9b4f-6cdebea2c83c" (UID: "6b62efe3-f320-4b06-9b4f-6cdebea2c83c"). InnerVolumeSpecName "kube-api-access-tv8ln". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.257875 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b62efe3-f320-4b06-9b4f-6cdebea2c83c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6b62efe3-f320-4b06-9b4f-6cdebea2c83c" (UID: "6b62efe3-f320-4b06-9b4f-6cdebea2c83c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.265217 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b62efe3-f320-4b06-9b4f-6cdebea2c83c-inventory" (OuterVolumeSpecName: "inventory") pod "6b62efe3-f320-4b06-9b4f-6cdebea2c83c" (UID: "6b62efe3-f320-4b06-9b4f-6cdebea2c83c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.334422 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b62efe3-f320-4b06-9b4f-6cdebea2c83c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.334469 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv8ln\" (UniqueName: \"kubernetes.io/projected/6b62efe3-f320-4b06-9b4f-6cdebea2c83c-kube-api-access-tv8ln\") on node \"crc\" DevicePath \"\"" Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.334486 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b62efe3-f320-4b06-9b4f-6cdebea2c83c-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.708932 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr" event={"ID":"6b62efe3-f320-4b06-9b4f-6cdebea2c83c","Type":"ContainerDied","Data":"140732b81ac0596387e9e3ca1a5593c0c9cc63f5dc03404bb61e3bce2f096e40"} Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.709359 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="140732b81ac0596387e9e3ca1a5593c0c9cc63f5dc03404bb61e3bce2f096e40" Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.709244 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr" Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.794732 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9vbmb"] Oct 09 10:58:22 crc kubenswrapper[4740]: E1009 10:58:22.795402 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b62efe3-f320-4b06-9b4f-6cdebea2c83c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.795437 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b62efe3-f320-4b06-9b4f-6cdebea2c83c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.795902 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b62efe3-f320-4b06-9b4f-6cdebea2c83c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.797079 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9vbmb" Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.799451 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hslsm" Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.799660 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.800293 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.801204 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.803291 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9vbmb"] Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.945054 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0cc7d46b-528d-415b-a1cf-34ea3e4483b5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9vbmb\" (UID: \"0cc7d46b-528d-415b-a1cf-34ea3e4483b5\") " pod="openstack/ssh-known-hosts-edpm-deployment-9vbmb" Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.945158 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxp8j\" (UniqueName: \"kubernetes.io/projected/0cc7d46b-528d-415b-a1cf-34ea3e4483b5-kube-api-access-rxp8j\") pod \"ssh-known-hosts-edpm-deployment-9vbmb\" (UID: \"0cc7d46b-528d-415b-a1cf-34ea3e4483b5\") " pod="openstack/ssh-known-hosts-edpm-deployment-9vbmb" Oct 09 10:58:22 crc kubenswrapper[4740]: I1009 10:58:22.945232 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0cc7d46b-528d-415b-a1cf-34ea3e4483b5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9vbmb\" (UID: \"0cc7d46b-528d-415b-a1cf-34ea3e4483b5\") " pod="openstack/ssh-known-hosts-edpm-deployment-9vbmb" Oct 09 10:58:23 crc kubenswrapper[4740]: I1009 10:58:23.047509 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0cc7d46b-528d-415b-a1cf-34ea3e4483b5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9vbmb\" (UID: \"0cc7d46b-528d-415b-a1cf-34ea3e4483b5\") " pod="openstack/ssh-known-hosts-edpm-deployment-9vbmb" Oct 09 10:58:23 crc kubenswrapper[4740]: I1009 10:58:23.047588 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxp8j\" (UniqueName: \"kubernetes.io/projected/0cc7d46b-528d-415b-a1cf-34ea3e4483b5-kube-api-access-rxp8j\") pod \"ssh-known-hosts-edpm-deployment-9vbmb\" (UID: \"0cc7d46b-528d-415b-a1cf-34ea3e4483b5\") " pod="openstack/ssh-known-hosts-edpm-deployment-9vbmb" Oct 09 10:58:23 crc kubenswrapper[4740]: I1009 10:58:23.047636 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0cc7d46b-528d-415b-a1cf-34ea3e4483b5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9vbmb\" (UID: \"0cc7d46b-528d-415b-a1cf-34ea3e4483b5\") " pod="openstack/ssh-known-hosts-edpm-deployment-9vbmb" Oct 09 10:58:23 crc kubenswrapper[4740]: I1009 10:58:23.052997 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0cc7d46b-528d-415b-a1cf-34ea3e4483b5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9vbmb\" (UID: \"0cc7d46b-528d-415b-a1cf-34ea3e4483b5\") " pod="openstack/ssh-known-hosts-edpm-deployment-9vbmb" Oct 09 10:58:23 crc kubenswrapper[4740]: I1009 10:58:23.053894 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0cc7d46b-528d-415b-a1cf-34ea3e4483b5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9vbmb\" (UID: \"0cc7d46b-528d-415b-a1cf-34ea3e4483b5\") " pod="openstack/ssh-known-hosts-edpm-deployment-9vbmb" Oct 09 10:58:23 crc kubenswrapper[4740]: I1009 10:58:23.078896 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxp8j\" (UniqueName: \"kubernetes.io/projected/0cc7d46b-528d-415b-a1cf-34ea3e4483b5-kube-api-access-rxp8j\") pod \"ssh-known-hosts-edpm-deployment-9vbmb\" (UID: \"0cc7d46b-528d-415b-a1cf-34ea3e4483b5\") " pod="openstack/ssh-known-hosts-edpm-deployment-9vbmb" Oct 09 10:58:23 crc kubenswrapper[4740]: I1009 10:58:23.129009 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9vbmb" Oct 09 10:58:23 crc kubenswrapper[4740]: I1009 10:58:23.718639 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9vbmb"] Oct 09 10:58:23 crc kubenswrapper[4740]: W1009 10:58:23.720035 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cc7d46b_528d_415b_a1cf_34ea3e4483b5.slice/crio-75510b57af9578e253a4fbf2b8749be07f7c92f46ee3ed92e90534666dcd3f39 WatchSource:0}: Error finding container 75510b57af9578e253a4fbf2b8749be07f7c92f46ee3ed92e90534666dcd3f39: Status 404 returned error can't find the container with id 75510b57af9578e253a4fbf2b8749be07f7c92f46ee3ed92e90534666dcd3f39 Oct 09 10:58:24 crc kubenswrapper[4740]: I1009 10:58:24.732410 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9vbmb" event={"ID":"0cc7d46b-528d-415b-a1cf-34ea3e4483b5","Type":"ContainerStarted","Data":"a72f70cba3bc1473324c53cef4bf69b5110bb9527daa8f7ed3996f50e8015101"} 
Oct 09 10:58:24 crc kubenswrapper[4740]: I1009 10:58:24.733525 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9vbmb" event={"ID":"0cc7d46b-528d-415b-a1cf-34ea3e4483b5","Type":"ContainerStarted","Data":"75510b57af9578e253a4fbf2b8749be07f7c92f46ee3ed92e90534666dcd3f39"} Oct 09 10:58:24 crc kubenswrapper[4740]: I1009 10:58:24.749582 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-9vbmb" podStartSLOduration=2.330860593 podStartE2EDuration="2.749568941s" podCreationTimestamp="2025-10-09 10:58:22 +0000 UTC" firstStartedPulling="2025-10-09 10:58:23.722380142 +0000 UTC m=+1842.684580533" lastFinishedPulling="2025-10-09 10:58:24.14108846 +0000 UTC m=+1843.103288881" observedRunningTime="2025-10-09 10:58:24.747403522 +0000 UTC m=+1843.709603903" watchObservedRunningTime="2025-10-09 10:58:24.749568941 +0000 UTC m=+1843.711769322" Oct 09 10:58:31 crc kubenswrapper[4740]: I1009 10:58:31.805180 4740 generic.go:334] "Generic (PLEG): container finished" podID="0cc7d46b-528d-415b-a1cf-34ea3e4483b5" containerID="a72f70cba3bc1473324c53cef4bf69b5110bb9527daa8f7ed3996f50e8015101" exitCode=0 Oct 09 10:58:31 crc kubenswrapper[4740]: I1009 10:58:31.805285 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9vbmb" event={"ID":"0cc7d46b-528d-415b-a1cf-34ea3e4483b5","Type":"ContainerDied","Data":"a72f70cba3bc1473324c53cef4bf69b5110bb9527daa8f7ed3996f50e8015101"} Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.299890 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9vbmb" Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.432523 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0cc7d46b-528d-415b-a1cf-34ea3e4483b5-ssh-key-openstack-edpm-ipam\") pod \"0cc7d46b-528d-415b-a1cf-34ea3e4483b5\" (UID: \"0cc7d46b-528d-415b-a1cf-34ea3e4483b5\") " Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.432662 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0cc7d46b-528d-415b-a1cf-34ea3e4483b5-inventory-0\") pod \"0cc7d46b-528d-415b-a1cf-34ea3e4483b5\" (UID: \"0cc7d46b-528d-415b-a1cf-34ea3e4483b5\") " Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.432720 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxp8j\" (UniqueName: \"kubernetes.io/projected/0cc7d46b-528d-415b-a1cf-34ea3e4483b5-kube-api-access-rxp8j\") pod \"0cc7d46b-528d-415b-a1cf-34ea3e4483b5\" (UID: \"0cc7d46b-528d-415b-a1cf-34ea3e4483b5\") " Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.438475 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cc7d46b-528d-415b-a1cf-34ea3e4483b5-kube-api-access-rxp8j" (OuterVolumeSpecName: "kube-api-access-rxp8j") pod "0cc7d46b-528d-415b-a1cf-34ea3e4483b5" (UID: "0cc7d46b-528d-415b-a1cf-34ea3e4483b5"). InnerVolumeSpecName "kube-api-access-rxp8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.459940 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc7d46b-528d-415b-a1cf-34ea3e4483b5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0cc7d46b-528d-415b-a1cf-34ea3e4483b5" (UID: "0cc7d46b-528d-415b-a1cf-34ea3e4483b5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.480473 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc7d46b-528d-415b-a1cf-34ea3e4483b5-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "0cc7d46b-528d-415b-a1cf-34ea3e4483b5" (UID: "0cc7d46b-528d-415b-a1cf-34ea3e4483b5"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.535569 4740 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0cc7d46b-528d-415b-a1cf-34ea3e4483b5-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.535613 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxp8j\" (UniqueName: \"kubernetes.io/projected/0cc7d46b-528d-415b-a1cf-34ea3e4483b5-kube-api-access-rxp8j\") on node \"crc\" DevicePath \"\"" Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.535630 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0cc7d46b-528d-415b-a1cf-34ea3e4483b5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.825825 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9vbmb" 
event={"ID":"0cc7d46b-528d-415b-a1cf-34ea3e4483b5","Type":"ContainerDied","Data":"75510b57af9578e253a4fbf2b8749be07f7c92f46ee3ed92e90534666dcd3f39"} Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.825896 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75510b57af9578e253a4fbf2b8749be07f7c92f46ee3ed92e90534666dcd3f39" Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.825977 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9vbmb" Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.943181 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-w8hc6"] Oct 09 10:58:33 crc kubenswrapper[4740]: E1009 10:58:33.943935 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc7d46b-528d-415b-a1cf-34ea3e4483b5" containerName="ssh-known-hosts-edpm-deployment" Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.943965 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc7d46b-528d-415b-a1cf-34ea3e4483b5" containerName="ssh-known-hosts-edpm-deployment" Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.944347 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc7d46b-528d-415b-a1cf-34ea3e4483b5" containerName="ssh-known-hosts-edpm-deployment" Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.945967 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w8hc6" Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.951596 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.952112 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.952359 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hslsm" Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.952583 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 10:58:33 crc kubenswrapper[4740]: I1009 10:58:33.964834 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-w8hc6"] Oct 09 10:58:34 crc kubenswrapper[4740]: I1009 10:58:34.048833 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpg8p\" (UniqueName: \"kubernetes.io/projected/c79d4035-1be0-44ff-9ddd-0a65a54be7ed-kube-api-access-tpg8p\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-w8hc6\" (UID: \"c79d4035-1be0-44ff-9ddd-0a65a54be7ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w8hc6" Oct 09 10:58:34 crc kubenswrapper[4740]: I1009 10:58:34.049239 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c79d4035-1be0-44ff-9ddd-0a65a54be7ed-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-w8hc6\" (UID: \"c79d4035-1be0-44ff-9ddd-0a65a54be7ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w8hc6" Oct 09 10:58:34 crc kubenswrapper[4740]: I1009 10:58:34.049413 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c79d4035-1be0-44ff-9ddd-0a65a54be7ed-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-w8hc6\" (UID: \"c79d4035-1be0-44ff-9ddd-0a65a54be7ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w8hc6" Oct 09 10:58:34 crc kubenswrapper[4740]: I1009 10:58:34.150315 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpg8p\" (UniqueName: \"kubernetes.io/projected/c79d4035-1be0-44ff-9ddd-0a65a54be7ed-kube-api-access-tpg8p\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-w8hc6\" (UID: \"c79d4035-1be0-44ff-9ddd-0a65a54be7ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w8hc6" Oct 09 10:58:34 crc kubenswrapper[4740]: I1009 10:58:34.150476 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c79d4035-1be0-44ff-9ddd-0a65a54be7ed-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-w8hc6\" (UID: \"c79d4035-1be0-44ff-9ddd-0a65a54be7ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w8hc6" Oct 09 10:58:34 crc kubenswrapper[4740]: I1009 10:58:34.150539 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c79d4035-1be0-44ff-9ddd-0a65a54be7ed-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-w8hc6\" (UID: \"c79d4035-1be0-44ff-9ddd-0a65a54be7ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w8hc6" Oct 09 10:58:34 crc kubenswrapper[4740]: I1009 10:58:34.155229 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c79d4035-1be0-44ff-9ddd-0a65a54be7ed-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-w8hc6\" (UID: \"c79d4035-1be0-44ff-9ddd-0a65a54be7ed\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w8hc6" Oct 09 10:58:34 crc kubenswrapper[4740]: I1009 10:58:34.155798 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c79d4035-1be0-44ff-9ddd-0a65a54be7ed-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-w8hc6\" (UID: \"c79d4035-1be0-44ff-9ddd-0a65a54be7ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w8hc6" Oct 09 10:58:34 crc kubenswrapper[4740]: I1009 10:58:34.172577 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpg8p\" (UniqueName: \"kubernetes.io/projected/c79d4035-1be0-44ff-9ddd-0a65a54be7ed-kube-api-access-tpg8p\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-w8hc6\" (UID: \"c79d4035-1be0-44ff-9ddd-0a65a54be7ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w8hc6" Oct 09 10:58:34 crc kubenswrapper[4740]: I1009 10:58:34.279155 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w8hc6" Oct 09 10:58:34 crc kubenswrapper[4740]: I1009 10:58:34.816166 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-w8hc6"] Oct 09 10:58:34 crc kubenswrapper[4740]: W1009 10:58:34.817298 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc79d4035_1be0_44ff_9ddd_0a65a54be7ed.slice/crio-938e478fb53c45939c35e0e4a64743365293619d432f2c0eecad6f944c0314a0 WatchSource:0}: Error finding container 938e478fb53c45939c35e0e4a64743365293619d432f2c0eecad6f944c0314a0: Status 404 returned error can't find the container with id 938e478fb53c45939c35e0e4a64743365293619d432f2c0eecad6f944c0314a0 Oct 09 10:58:34 crc kubenswrapper[4740]: I1009 10:58:34.835334 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w8hc6" event={"ID":"c79d4035-1be0-44ff-9ddd-0a65a54be7ed","Type":"ContainerStarted","Data":"938e478fb53c45939c35e0e4a64743365293619d432f2c0eecad6f944c0314a0"} Oct 09 10:58:35 crc kubenswrapper[4740]: I1009 10:58:35.844246 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w8hc6" event={"ID":"c79d4035-1be0-44ff-9ddd-0a65a54be7ed","Type":"ContainerStarted","Data":"3e88877167d7d6f5f6a14db5468e6582eb8c4a1dc7aa08d153dda979f4297694"} Oct 09 10:58:35 crc kubenswrapper[4740]: I1009 10:58:35.869375 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w8hc6" podStartSLOduration=2.435888982 podStartE2EDuration="2.869359706s" podCreationTimestamp="2025-10-09 10:58:33 +0000 UTC" firstStartedPulling="2025-10-09 10:58:34.820333417 +0000 UTC m=+1853.782533798" lastFinishedPulling="2025-10-09 10:58:35.253804141 +0000 UTC m=+1854.216004522" observedRunningTime="2025-10-09 
10:58:35.862310523 +0000 UTC m=+1854.824510924" watchObservedRunningTime="2025-10-09 10:58:35.869359706 +0000 UTC m=+1854.831560087" Oct 09 10:58:43 crc kubenswrapper[4740]: I1009 10:58:43.920962 4740 generic.go:334] "Generic (PLEG): container finished" podID="c79d4035-1be0-44ff-9ddd-0a65a54be7ed" containerID="3e88877167d7d6f5f6a14db5468e6582eb8c4a1dc7aa08d153dda979f4297694" exitCode=0 Oct 09 10:58:43 crc kubenswrapper[4740]: I1009 10:58:43.921041 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w8hc6" event={"ID":"c79d4035-1be0-44ff-9ddd-0a65a54be7ed","Type":"ContainerDied","Data":"3e88877167d7d6f5f6a14db5468e6582eb8c4a1dc7aa08d153dda979f4297694"} Oct 09 10:58:45 crc kubenswrapper[4740]: I1009 10:58:45.400792 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w8hc6" Oct 09 10:58:45 crc kubenswrapper[4740]: I1009 10:58:45.469795 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpg8p\" (UniqueName: \"kubernetes.io/projected/c79d4035-1be0-44ff-9ddd-0a65a54be7ed-kube-api-access-tpg8p\") pod \"c79d4035-1be0-44ff-9ddd-0a65a54be7ed\" (UID: \"c79d4035-1be0-44ff-9ddd-0a65a54be7ed\") " Oct 09 10:58:45 crc kubenswrapper[4740]: I1009 10:58:45.470549 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c79d4035-1be0-44ff-9ddd-0a65a54be7ed-inventory\") pod \"c79d4035-1be0-44ff-9ddd-0a65a54be7ed\" (UID: \"c79d4035-1be0-44ff-9ddd-0a65a54be7ed\") " Oct 09 10:58:45 crc kubenswrapper[4740]: I1009 10:58:45.470860 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c79d4035-1be0-44ff-9ddd-0a65a54be7ed-ssh-key\") pod \"c79d4035-1be0-44ff-9ddd-0a65a54be7ed\" (UID: \"c79d4035-1be0-44ff-9ddd-0a65a54be7ed\") " Oct 09 10:58:45 crc 
kubenswrapper[4740]: I1009 10:58:45.476235 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c79d4035-1be0-44ff-9ddd-0a65a54be7ed-kube-api-access-tpg8p" (OuterVolumeSpecName: "kube-api-access-tpg8p") pod "c79d4035-1be0-44ff-9ddd-0a65a54be7ed" (UID: "c79d4035-1be0-44ff-9ddd-0a65a54be7ed"). InnerVolumeSpecName "kube-api-access-tpg8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 10:58:45 crc kubenswrapper[4740]: I1009 10:58:45.497828 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c79d4035-1be0-44ff-9ddd-0a65a54be7ed-inventory" (OuterVolumeSpecName: "inventory") pod "c79d4035-1be0-44ff-9ddd-0a65a54be7ed" (UID: "c79d4035-1be0-44ff-9ddd-0a65a54be7ed"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:58:45 crc kubenswrapper[4740]: I1009 10:58:45.514506 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c79d4035-1be0-44ff-9ddd-0a65a54be7ed-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c79d4035-1be0-44ff-9ddd-0a65a54be7ed" (UID: "c79d4035-1be0-44ff-9ddd-0a65a54be7ed"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 10:58:45 crc kubenswrapper[4740]: I1009 10:58:45.575288 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpg8p\" (UniqueName: \"kubernetes.io/projected/c79d4035-1be0-44ff-9ddd-0a65a54be7ed-kube-api-access-tpg8p\") on node \"crc\" DevicePath \"\"" Oct 09 10:58:45 crc kubenswrapper[4740]: I1009 10:58:45.575333 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c79d4035-1be0-44ff-9ddd-0a65a54be7ed-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 10:58:45 crc kubenswrapper[4740]: I1009 10:58:45.575345 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c79d4035-1be0-44ff-9ddd-0a65a54be7ed-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 10:58:45 crc kubenswrapper[4740]: I1009 10:58:45.945585 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w8hc6" event={"ID":"c79d4035-1be0-44ff-9ddd-0a65a54be7ed","Type":"ContainerDied","Data":"938e478fb53c45939c35e0e4a64743365293619d432f2c0eecad6f944c0314a0"} Oct 09 10:58:45 crc kubenswrapper[4740]: I1009 10:58:45.945639 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="938e478fb53c45939c35e0e4a64743365293619d432f2c0eecad6f944c0314a0" Oct 09 10:58:45 crc kubenswrapper[4740]: I1009 10:58:45.945700 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-w8hc6" Oct 09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.027554 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg"] Oct 09 10:58:46 crc kubenswrapper[4740]: E1009 10:58:46.028089 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c79d4035-1be0-44ff-9ddd-0a65a54be7ed" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.028117 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c79d4035-1be0-44ff-9ddd-0a65a54be7ed" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.028387 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="c79d4035-1be0-44ff-9ddd-0a65a54be7ed" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.029534 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg" Oct 09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.031324 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hslsm" Oct 09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.033033 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.033067 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.033071 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.045139 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg"] Oct 09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.074797 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-66569d88ff-tjljh" podUID="501b9024-4f9f-41eb-ae73-d9ecb0637363" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.087519 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f30a224-f5af-498e-97f3-28a5a26f9884-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg\" (UID: \"3f30a224-f5af-498e-97f3-28a5a26f9884\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg" Oct 09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.087637 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgl2t\" (UniqueName: 
\"kubernetes.io/projected/3f30a224-f5af-498e-97f3-28a5a26f9884-kube-api-access-bgl2t\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg\" (UID: \"3f30a224-f5af-498e-97f3-28a5a26f9884\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg" Oct 09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.087695 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f30a224-f5af-498e-97f3-28a5a26f9884-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg\" (UID: \"3f30a224-f5af-498e-97f3-28a5a26f9884\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg" Oct 09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.189335 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f30a224-f5af-498e-97f3-28a5a26f9884-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg\" (UID: \"3f30a224-f5af-498e-97f3-28a5a26f9884\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg" Oct 09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.189406 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgl2t\" (UniqueName: \"kubernetes.io/projected/3f30a224-f5af-498e-97f3-28a5a26f9884-kube-api-access-bgl2t\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg\" (UID: \"3f30a224-f5af-498e-97f3-28a5a26f9884\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg" Oct 09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.189438 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f30a224-f5af-498e-97f3-28a5a26f9884-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg\" (UID: \"3f30a224-f5af-498e-97f3-28a5a26f9884\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg" Oct 
09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.194808 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f30a224-f5af-498e-97f3-28a5a26f9884-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg\" (UID: \"3f30a224-f5af-498e-97f3-28a5a26f9884\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg" Oct 09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.195070 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f30a224-f5af-498e-97f3-28a5a26f9884-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg\" (UID: \"3f30a224-f5af-498e-97f3-28a5a26f9884\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg" Oct 09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.208711 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgl2t\" (UniqueName: \"kubernetes.io/projected/3f30a224-f5af-498e-97f3-28a5a26f9884-kube-api-access-bgl2t\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg\" (UID: \"3f30a224-f5af-498e-97f3-28a5a26f9884\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg" Oct 09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.357063 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg" Oct 09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.918119 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg"] Oct 09 10:58:46 crc kubenswrapper[4740]: I1009 10:58:46.971665 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg" event={"ID":"3f30a224-f5af-498e-97f3-28a5a26f9884","Type":"ContainerStarted","Data":"6e8a74ea5ab302f9ad21cfcc4ff7692a9bd1747f105afb681f9f750273d9cd78"} Oct 09 10:58:47 crc kubenswrapper[4740]: I1009 10:58:47.979248 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg" event={"ID":"3f30a224-f5af-498e-97f3-28a5a26f9884","Type":"ContainerStarted","Data":"5fe79beb09e89fffd2994894794a8d7f12ee81b2764b65b5c7b37544fe4f5222"} Oct 09 11:00:00 crc kubenswrapper[4740]: I1009 11:00:00.153421 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg" podStartSLOduration=74.72835857 podStartE2EDuration="1m15.153397853s" podCreationTimestamp="2025-10-09 10:58:45 +0000 UTC" firstStartedPulling="2025-10-09 10:58:46.92380781 +0000 UTC m=+1865.886008211" lastFinishedPulling="2025-10-09 10:58:47.348847113 +0000 UTC m=+1866.311047494" observedRunningTime="2025-10-09 10:58:48.01445598 +0000 UTC m=+1866.976656361" watchObservedRunningTime="2025-10-09 11:00:00.153397853 +0000 UTC m=+1939.115598244" Oct 09 11:00:00 crc kubenswrapper[4740]: I1009 11:00:00.158725 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333460-nlgwq"] Oct 09 11:00:00 crc kubenswrapper[4740]: I1009 11:00:00.160261 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333460-nlgwq" Oct 09 11:00:00 crc kubenswrapper[4740]: I1009 11:00:00.168033 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 11:00:00 crc kubenswrapper[4740]: I1009 11:00:00.168480 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 11:00:00 crc kubenswrapper[4740]: I1009 11:00:00.177450 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333460-nlgwq"] Oct 09 11:00:00 crc kubenswrapper[4740]: I1009 11:00:00.272064 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5hs5\" (UniqueName: \"kubernetes.io/projected/bab3167b-83b6-4294-834f-39300ea36843-kube-api-access-b5hs5\") pod \"collect-profiles-29333460-nlgwq\" (UID: \"bab3167b-83b6-4294-834f-39300ea36843\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333460-nlgwq" Oct 09 11:00:00 crc kubenswrapper[4740]: I1009 11:00:00.273526 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bab3167b-83b6-4294-834f-39300ea36843-secret-volume\") pod \"collect-profiles-29333460-nlgwq\" (UID: \"bab3167b-83b6-4294-834f-39300ea36843\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333460-nlgwq" Oct 09 11:00:00 crc kubenswrapper[4740]: I1009 11:00:00.273675 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bab3167b-83b6-4294-834f-39300ea36843-config-volume\") pod \"collect-profiles-29333460-nlgwq\" (UID: \"bab3167b-83b6-4294-834f-39300ea36843\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29333460-nlgwq" Oct 09 11:00:00 crc kubenswrapper[4740]: I1009 11:00:00.374939 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bab3167b-83b6-4294-834f-39300ea36843-secret-volume\") pod \"collect-profiles-29333460-nlgwq\" (UID: \"bab3167b-83b6-4294-834f-39300ea36843\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333460-nlgwq" Oct 09 11:00:00 crc kubenswrapper[4740]: I1009 11:00:00.374997 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bab3167b-83b6-4294-834f-39300ea36843-config-volume\") pod \"collect-profiles-29333460-nlgwq\" (UID: \"bab3167b-83b6-4294-834f-39300ea36843\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333460-nlgwq" Oct 09 11:00:00 crc kubenswrapper[4740]: I1009 11:00:00.375056 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5hs5\" (UniqueName: \"kubernetes.io/projected/bab3167b-83b6-4294-834f-39300ea36843-kube-api-access-b5hs5\") pod \"collect-profiles-29333460-nlgwq\" (UID: \"bab3167b-83b6-4294-834f-39300ea36843\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333460-nlgwq" Oct 09 11:00:00 crc kubenswrapper[4740]: I1009 11:00:00.376267 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bab3167b-83b6-4294-834f-39300ea36843-config-volume\") pod \"collect-profiles-29333460-nlgwq\" (UID: \"bab3167b-83b6-4294-834f-39300ea36843\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333460-nlgwq" Oct 09 11:00:00 crc kubenswrapper[4740]: I1009 11:00:00.387503 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/bab3167b-83b6-4294-834f-39300ea36843-secret-volume\") pod \"collect-profiles-29333460-nlgwq\" (UID: \"bab3167b-83b6-4294-834f-39300ea36843\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333460-nlgwq" Oct 09 11:00:00 crc kubenswrapper[4740]: I1009 11:00:00.397586 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5hs5\" (UniqueName: \"kubernetes.io/projected/bab3167b-83b6-4294-834f-39300ea36843-kube-api-access-b5hs5\") pod \"collect-profiles-29333460-nlgwq\" (UID: \"bab3167b-83b6-4294-834f-39300ea36843\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333460-nlgwq" Oct 09 11:00:00 crc kubenswrapper[4740]: I1009 11:00:00.481668 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333460-nlgwq" Oct 09 11:00:00 crc kubenswrapper[4740]: I1009 11:00:00.935917 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333460-nlgwq"] Oct 09 11:00:01 crc kubenswrapper[4740]: I1009 11:00:01.714731 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333460-nlgwq" event={"ID":"bab3167b-83b6-4294-834f-39300ea36843","Type":"ContainerStarted","Data":"3277536e815b326712d914615ac790d704340e1fa354d0ee4ed961fe0e72970f"} Oct 09 11:00:01 crc kubenswrapper[4740]: I1009 11:00:01.715138 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333460-nlgwq" event={"ID":"bab3167b-83b6-4294-834f-39300ea36843","Type":"ContainerStarted","Data":"9b27cadc4887ea99fe68b2f69f0e8fb58601211ad0a1c843fab786c56d304931"} Oct 09 11:00:05 crc kubenswrapper[4740]: I1009 11:00:05.764027 4740 generic.go:334] "Generic (PLEG): container finished" podID="bab3167b-83b6-4294-834f-39300ea36843" 
containerID="3277536e815b326712d914615ac790d704340e1fa354d0ee4ed961fe0e72970f" exitCode=0 Oct 09 11:00:05 crc kubenswrapper[4740]: I1009 11:00:05.765782 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333460-nlgwq" event={"ID":"bab3167b-83b6-4294-834f-39300ea36843","Type":"ContainerDied","Data":"3277536e815b326712d914615ac790d704340e1fa354d0ee4ed961fe0e72970f"} Oct 09 11:00:05 crc kubenswrapper[4740]: I1009 11:00:05.771849 4740 generic.go:334] "Generic (PLEG): container finished" podID="3f30a224-f5af-498e-97f3-28a5a26f9884" containerID="5fe79beb09e89fffd2994894794a8d7f12ee81b2764b65b5c7b37544fe4f5222" exitCode=0 Oct 09 11:00:05 crc kubenswrapper[4740]: I1009 11:00:05.772057 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg" event={"ID":"3f30a224-f5af-498e-97f3-28a5a26f9884","Type":"ContainerDied","Data":"5fe79beb09e89fffd2994894794a8d7f12ee81b2764b65b5c7b37544fe4f5222"} Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.245722 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333460-nlgwq" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.253451 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.400450 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f30a224-f5af-498e-97f3-28a5a26f9884-inventory\") pod \"3f30a224-f5af-498e-97f3-28a5a26f9884\" (UID: \"3f30a224-f5af-498e-97f3-28a5a26f9884\") " Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.400523 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bab3167b-83b6-4294-834f-39300ea36843-config-volume\") pod \"bab3167b-83b6-4294-834f-39300ea36843\" (UID: \"bab3167b-83b6-4294-834f-39300ea36843\") " Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.400552 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgl2t\" (UniqueName: \"kubernetes.io/projected/3f30a224-f5af-498e-97f3-28a5a26f9884-kube-api-access-bgl2t\") pod \"3f30a224-f5af-498e-97f3-28a5a26f9884\" (UID: \"3f30a224-f5af-498e-97f3-28a5a26f9884\") " Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.400597 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f30a224-f5af-498e-97f3-28a5a26f9884-ssh-key\") pod \"3f30a224-f5af-498e-97f3-28a5a26f9884\" (UID: \"3f30a224-f5af-498e-97f3-28a5a26f9884\") " Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.400810 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5hs5\" (UniqueName: \"kubernetes.io/projected/bab3167b-83b6-4294-834f-39300ea36843-kube-api-access-b5hs5\") pod \"bab3167b-83b6-4294-834f-39300ea36843\" (UID: \"bab3167b-83b6-4294-834f-39300ea36843\") " Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.400859 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bab3167b-83b6-4294-834f-39300ea36843-secret-volume\") pod \"bab3167b-83b6-4294-834f-39300ea36843\" (UID: \"bab3167b-83b6-4294-834f-39300ea36843\") " Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.401747 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bab3167b-83b6-4294-834f-39300ea36843-config-volume" (OuterVolumeSpecName: "config-volume") pod "bab3167b-83b6-4294-834f-39300ea36843" (UID: "bab3167b-83b6-4294-834f-39300ea36843"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.408099 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bab3167b-83b6-4294-834f-39300ea36843-kube-api-access-b5hs5" (OuterVolumeSpecName: "kube-api-access-b5hs5") pod "bab3167b-83b6-4294-834f-39300ea36843" (UID: "bab3167b-83b6-4294-834f-39300ea36843"). InnerVolumeSpecName "kube-api-access-b5hs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.412847 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bab3167b-83b6-4294-834f-39300ea36843-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bab3167b-83b6-4294-834f-39300ea36843" (UID: "bab3167b-83b6-4294-834f-39300ea36843"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.413048 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f30a224-f5af-498e-97f3-28a5a26f9884-kube-api-access-bgl2t" (OuterVolumeSpecName: "kube-api-access-bgl2t") pod "3f30a224-f5af-498e-97f3-28a5a26f9884" (UID: "3f30a224-f5af-498e-97f3-28a5a26f9884"). InnerVolumeSpecName "kube-api-access-bgl2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.445310 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f30a224-f5af-498e-97f3-28a5a26f9884-inventory" (OuterVolumeSpecName: "inventory") pod "3f30a224-f5af-498e-97f3-28a5a26f9884" (UID: "3f30a224-f5af-498e-97f3-28a5a26f9884"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.451007 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f30a224-f5af-498e-97f3-28a5a26f9884-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3f30a224-f5af-498e-97f3-28a5a26f9884" (UID: "3f30a224-f5af-498e-97f3-28a5a26f9884"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.502573 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f30a224-f5af-498e-97f3-28a5a26f9884-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.502599 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5hs5\" (UniqueName: \"kubernetes.io/projected/bab3167b-83b6-4294-834f-39300ea36843-kube-api-access-b5hs5\") on node \"crc\" DevicePath \"\"" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.502610 4740 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bab3167b-83b6-4294-834f-39300ea36843-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.502618 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f30a224-f5af-498e-97f3-28a5a26f9884-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 
11:00:07.502627 4740 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bab3167b-83b6-4294-834f-39300ea36843-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.502636 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgl2t\" (UniqueName: \"kubernetes.io/projected/3f30a224-f5af-498e-97f3-28a5a26f9884-kube-api-access-bgl2t\") on node \"crc\" DevicePath \"\"" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.805087 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.805120 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg" event={"ID":"3f30a224-f5af-498e-97f3-28a5a26f9884","Type":"ContainerDied","Data":"6e8a74ea5ab302f9ad21cfcc4ff7692a9bd1747f105afb681f9f750273d9cd78"} Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.805440 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e8a74ea5ab302f9ad21cfcc4ff7692a9bd1747f105afb681f9f750273d9cd78" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.808260 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333460-nlgwq" event={"ID":"bab3167b-83b6-4294-834f-39300ea36843","Type":"ContainerDied","Data":"9b27cadc4887ea99fe68b2f69f0e8fb58601211ad0a1c843fab786c56d304931"} Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.808294 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b27cadc4887ea99fe68b2f69f0e8fb58601211ad0a1c843fab786c56d304931" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.808382 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333460-nlgwq" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.897366 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn"] Oct 09 11:00:07 crc kubenswrapper[4740]: E1009 11:00:07.898003 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f30a224-f5af-498e-97f3-28a5a26f9884" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.898022 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f30a224-f5af-498e-97f3-28a5a26f9884" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 09 11:00:07 crc kubenswrapper[4740]: E1009 11:00:07.898041 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab3167b-83b6-4294-834f-39300ea36843" containerName="collect-profiles" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.898048 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab3167b-83b6-4294-834f-39300ea36843" containerName="collect-profiles" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.898224 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab3167b-83b6-4294-834f-39300ea36843" containerName="collect-profiles" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.898235 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f30a224-f5af-498e-97f3-28a5a26f9884" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.898865 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.903346 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.903553 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.903565 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.904998 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.905490 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.906543 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.907337 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.909056 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hslsm" Oct 09 11:00:07 crc kubenswrapper[4740]: I1009 11:00:07.912410 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn"] Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.012527 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.012661 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.012707 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.012738 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.012853 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skr5x\" (UniqueName: 
\"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-kube-api-access-skr5x\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.012909 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.012945 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.013081 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.013222 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.013298 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.013351 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.013518 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.013590 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.013661 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.115719 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.116030 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.116110 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-neutron-metadata-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.116241 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skr5x\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-kube-api-access-skr5x\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.116332 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.116421 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.116543 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: 
\"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.116965 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.117068 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.117181 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.117285 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc 
kubenswrapper[4740]: I1009 11:00:08.117389 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.117467 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.117550 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.120189 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.120200 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.120959 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.121648 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.121675 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.122385 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.122451 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.123291 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.123583 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.123590 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: 
\"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.124435 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.124811 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.129027 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.135259 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skr5x\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-kube-api-access-skr5x\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h6knn\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 
11:00:08.215155 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.770662 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn"] Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.777951 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 11:00:08 crc kubenswrapper[4740]: I1009 11:00:08.819343 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" event={"ID":"d611b217-c3b5-49dd-9a5f-acd64171310d","Type":"ContainerStarted","Data":"89b181fd9af1b04e8ecaf8735eaf39cc7dd175e9abfb02069333044624d756ae"} Oct 09 11:00:10 crc kubenswrapper[4740]: I1009 11:00:10.835363 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" event={"ID":"d611b217-c3b5-49dd-9a5f-acd64171310d","Type":"ContainerStarted","Data":"8667c14d8ed903deb29162ed719402c5fa7db664d078df3df476c32dbd2d9ebf"} Oct 09 11:00:10 crc kubenswrapper[4740]: I1009 11:00:10.860935 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" podStartSLOduration=2.518132574 podStartE2EDuration="3.860919628s" podCreationTimestamp="2025-10-09 11:00:07 +0000 UTC" firstStartedPulling="2025-10-09 11:00:08.77753672 +0000 UTC m=+1947.739737111" lastFinishedPulling="2025-10-09 11:00:10.120323784 +0000 UTC m=+1949.082524165" observedRunningTime="2025-10-09 11:00:10.857732821 +0000 UTC m=+1949.819933212" watchObservedRunningTime="2025-10-09 11:00:10.860919628 +0000 UTC m=+1949.823120009" Oct 09 11:00:35 crc kubenswrapper[4740]: I1009 11:00:35.408954 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 11:00:35 crc kubenswrapper[4740]: I1009 11:00:35.409514 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 11:00:49 crc kubenswrapper[4740]: I1009 11:00:49.253666 4740 generic.go:334] "Generic (PLEG): container finished" podID="d611b217-c3b5-49dd-9a5f-acd64171310d" containerID="8667c14d8ed903deb29162ed719402c5fa7db664d078df3df476c32dbd2d9ebf" exitCode=0 Oct 09 11:00:49 crc kubenswrapper[4740]: I1009 11:00:49.253813 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" event={"ID":"d611b217-c3b5-49dd-9a5f-acd64171310d","Type":"ContainerDied","Data":"8667c14d8ed903deb29162ed719402c5fa7db664d078df3df476c32dbd2d9ebf"} Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.648693 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.794354 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"d611b217-c3b5-49dd-9a5f-acd64171310d\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.794425 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-neutron-metadata-combined-ca-bundle\") pod \"d611b217-c3b5-49dd-9a5f-acd64171310d\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.794457 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-libvirt-combined-ca-bundle\") pod \"d611b217-c3b5-49dd-9a5f-acd64171310d\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.794549 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-ovn-combined-ca-bundle\") pod \"d611b217-c3b5-49dd-9a5f-acd64171310d\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.794583 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-inventory\") pod \"d611b217-c3b5-49dd-9a5f-acd64171310d\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " Oct 09 
11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.794627 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"d611b217-c3b5-49dd-9a5f-acd64171310d\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.794667 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-repo-setup-combined-ca-bundle\") pod \"d611b217-c3b5-49dd-9a5f-acd64171310d\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.794731 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-bootstrap-combined-ca-bundle\") pod \"d611b217-c3b5-49dd-9a5f-acd64171310d\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.794834 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-ssh-key\") pod \"d611b217-c3b5-49dd-9a5f-acd64171310d\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.794885 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-telemetry-combined-ca-bundle\") pod \"d611b217-c3b5-49dd-9a5f-acd64171310d\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.794934 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"d611b217-c3b5-49dd-9a5f-acd64171310d\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.794958 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skr5x\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-kube-api-access-skr5x\") pod \"d611b217-c3b5-49dd-9a5f-acd64171310d\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.794999 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-nova-combined-ca-bundle\") pod \"d611b217-c3b5-49dd-9a5f-acd64171310d\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.796244 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"d611b217-c3b5-49dd-9a5f-acd64171310d\" (UID: \"d611b217-c3b5-49dd-9a5f-acd64171310d\") " Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.802467 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-kube-api-access-skr5x" (OuterVolumeSpecName: "kube-api-access-skr5x") pod "d611b217-c3b5-49dd-9a5f-acd64171310d" (UID: "d611b217-c3b5-49dd-9a5f-acd64171310d"). InnerVolumeSpecName "kube-api-access-skr5x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.804041 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d611b217-c3b5-49dd-9a5f-acd64171310d" (UID: "d611b217-c3b5-49dd-9a5f-acd64171310d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.804486 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "d611b217-c3b5-49dd-9a5f-acd64171310d" (UID: "d611b217-c3b5-49dd-9a5f-acd64171310d"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.804601 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "d611b217-c3b5-49dd-9a5f-acd64171310d" (UID: "d611b217-c3b5-49dd-9a5f-acd64171310d"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.804941 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d611b217-c3b5-49dd-9a5f-acd64171310d" (UID: "d611b217-c3b5-49dd-9a5f-acd64171310d"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.805654 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d611b217-c3b5-49dd-9a5f-acd64171310d" (UID: "d611b217-c3b5-49dd-9a5f-acd64171310d"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.805723 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "d611b217-c3b5-49dd-9a5f-acd64171310d" (UID: "d611b217-c3b5-49dd-9a5f-acd64171310d"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.805970 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d611b217-c3b5-49dd-9a5f-acd64171310d" (UID: "d611b217-c3b5-49dd-9a5f-acd64171310d"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.806051 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "d611b217-c3b5-49dd-9a5f-acd64171310d" (UID: "d611b217-c3b5-49dd-9a5f-acd64171310d"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.806938 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d611b217-c3b5-49dd-9a5f-acd64171310d" (UID: "d611b217-c3b5-49dd-9a5f-acd64171310d"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.809731 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d611b217-c3b5-49dd-9a5f-acd64171310d" (UID: "d611b217-c3b5-49dd-9a5f-acd64171310d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.813922 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d611b217-c3b5-49dd-9a5f-acd64171310d" (UID: "d611b217-c3b5-49dd-9a5f-acd64171310d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.835387 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d611b217-c3b5-49dd-9a5f-acd64171310d" (UID: "d611b217-c3b5-49dd-9a5f-acd64171310d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.849312 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-inventory" (OuterVolumeSpecName: "inventory") pod "d611b217-c3b5-49dd-9a5f-acd64171310d" (UID: "d611b217-c3b5-49dd-9a5f-acd64171310d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.898851 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.898893 4740 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.898907 4740 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.898918 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skr5x\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-kube-api-access-skr5x\") 
on node \"crc\" DevicePath \"\"" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.898931 4740 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.898943 4740 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.898955 4740 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.898969 4740 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.898983 4740 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.898994 4740 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.899006 4740 reconciler_common.go:293] "Volume detached for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.899018 4740 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d611b217-c3b5-49dd-9a5f-acd64171310d-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.899030 4740 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 11:00:50 crc kubenswrapper[4740]: I1009 11:00:50.899043 4740 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d611b217-c3b5-49dd-9a5f-acd64171310d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.272975 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" event={"ID":"d611b217-c3b5-49dd-9a5f-acd64171310d","Type":"ContainerDied","Data":"89b181fd9af1b04e8ecaf8735eaf39cc7dd175e9abfb02069333044624d756ae"} Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.273022 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89b181fd9af1b04e8ecaf8735eaf39cc7dd175e9abfb02069333044624d756ae" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.273043 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h6knn" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.364848 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5"] Oct 09 11:00:51 crc kubenswrapper[4740]: E1009 11:00:51.365265 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d611b217-c3b5-49dd-9a5f-acd64171310d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.365285 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d611b217-c3b5-49dd-9a5f-acd64171310d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.365445 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="d611b217-c3b5-49dd-9a5f-acd64171310d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.366092 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.369343 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.369353 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hslsm" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.369657 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.369941 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.370912 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.383999 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5"] Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.513527 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/72061500-62b1-404d-8def-280fcca2e73f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hd9p5\" (UID: \"72061500-62b1-404d-8def-280fcca2e73f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.513619 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/72061500-62b1-404d-8def-280fcca2e73f-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hd9p5\" (UID: \"72061500-62b1-404d-8def-280fcca2e73f\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.513660 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72061500-62b1-404d-8def-280fcca2e73f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hd9p5\" (UID: \"72061500-62b1-404d-8def-280fcca2e73f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.513722 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72061500-62b1-404d-8def-280fcca2e73f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hd9p5\" (UID: \"72061500-62b1-404d-8def-280fcca2e73f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.513941 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5nd4\" (UniqueName: \"kubernetes.io/projected/72061500-62b1-404d-8def-280fcca2e73f-kube-api-access-g5nd4\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hd9p5\" (UID: \"72061500-62b1-404d-8def-280fcca2e73f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.615867 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/72061500-62b1-404d-8def-280fcca2e73f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hd9p5\" (UID: \"72061500-62b1-404d-8def-280fcca2e73f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.615937 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/72061500-62b1-404d-8def-280fcca2e73f-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hd9p5\" (UID: \"72061500-62b1-404d-8def-280fcca2e73f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.615970 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72061500-62b1-404d-8def-280fcca2e73f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hd9p5\" (UID: \"72061500-62b1-404d-8def-280fcca2e73f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.616012 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72061500-62b1-404d-8def-280fcca2e73f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hd9p5\" (UID: \"72061500-62b1-404d-8def-280fcca2e73f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.616065 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5nd4\" (UniqueName: \"kubernetes.io/projected/72061500-62b1-404d-8def-280fcca2e73f-kube-api-access-g5nd4\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hd9p5\" (UID: \"72061500-62b1-404d-8def-280fcca2e73f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.616769 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/72061500-62b1-404d-8def-280fcca2e73f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hd9p5\" (UID: \"72061500-62b1-404d-8def-280fcca2e73f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" Oct 09 11:00:51 crc 
kubenswrapper[4740]: I1009 11:00:51.620653 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72061500-62b1-404d-8def-280fcca2e73f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hd9p5\" (UID: \"72061500-62b1-404d-8def-280fcca2e73f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.620858 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/72061500-62b1-404d-8def-280fcca2e73f-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hd9p5\" (UID: \"72061500-62b1-404d-8def-280fcca2e73f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.621888 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72061500-62b1-404d-8def-280fcca2e73f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hd9p5\" (UID: \"72061500-62b1-404d-8def-280fcca2e73f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.631413 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5nd4\" (UniqueName: \"kubernetes.io/projected/72061500-62b1-404d-8def-280fcca2e73f-kube-api-access-g5nd4\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hd9p5\" (UID: \"72061500-62b1-404d-8def-280fcca2e73f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" Oct 09 11:00:51 crc kubenswrapper[4740]: I1009 11:00:51.705289 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" Oct 09 11:00:52 crc kubenswrapper[4740]: I1009 11:00:52.246075 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5"] Oct 09 11:00:52 crc kubenswrapper[4740]: I1009 11:00:52.281346 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" event={"ID":"72061500-62b1-404d-8def-280fcca2e73f","Type":"ContainerStarted","Data":"94353f17ffbd037d561c6d4c809b50275be62882b6f80050012f46c61e557677"} Oct 09 11:00:54 crc kubenswrapper[4740]: I1009 11:00:54.309415 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" event={"ID":"72061500-62b1-404d-8def-280fcca2e73f","Type":"ContainerStarted","Data":"e24ad72d5ef03f04171553a0e97035d9fb68b1e620f5e6c1edce59cfa81b39b5"} Oct 09 11:00:54 crc kubenswrapper[4740]: I1009 11:00:54.337309 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" podStartSLOduration=2.362268699 podStartE2EDuration="3.337287949s" podCreationTimestamp="2025-10-09 11:00:51 +0000 UTC" firstStartedPulling="2025-10-09 11:00:52.250857599 +0000 UTC m=+1991.213057980" lastFinishedPulling="2025-10-09 11:00:53.225876849 +0000 UTC m=+1992.188077230" observedRunningTime="2025-10-09 11:00:54.326892165 +0000 UTC m=+1993.289092556" watchObservedRunningTime="2025-10-09 11:00:54.337287949 +0000 UTC m=+1993.299488340" Oct 09 11:01:00 crc kubenswrapper[4740]: I1009 11:01:00.164691 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29333461-s8498"] Oct 09 11:01:00 crc kubenswrapper[4740]: I1009 11:01:00.166646 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29333461-s8498" Oct 09 11:01:00 crc kubenswrapper[4740]: I1009 11:01:00.185355 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29333461-s8498"] Oct 09 11:01:00 crc kubenswrapper[4740]: I1009 11:01:00.276055 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-fernet-keys\") pod \"keystone-cron-29333461-s8498\" (UID: \"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59\") " pod="openstack/keystone-cron-29333461-s8498" Oct 09 11:01:00 crc kubenswrapper[4740]: I1009 11:01:00.276102 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-combined-ca-bundle\") pod \"keystone-cron-29333461-s8498\" (UID: \"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59\") " pod="openstack/keystone-cron-29333461-s8498" Oct 09 11:01:00 crc kubenswrapper[4740]: I1009 11:01:00.276121 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stk87\" (UniqueName: \"kubernetes.io/projected/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-kube-api-access-stk87\") pod \"keystone-cron-29333461-s8498\" (UID: \"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59\") " pod="openstack/keystone-cron-29333461-s8498" Oct 09 11:01:00 crc kubenswrapper[4740]: I1009 11:01:00.276164 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-config-data\") pod \"keystone-cron-29333461-s8498\" (UID: \"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59\") " pod="openstack/keystone-cron-29333461-s8498" Oct 09 11:01:00 crc kubenswrapper[4740]: I1009 11:01:00.378237 4740 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-config-data\") pod \"keystone-cron-29333461-s8498\" (UID: \"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59\") " pod="openstack/keystone-cron-29333461-s8498" Oct 09 11:01:00 crc kubenswrapper[4740]: I1009 11:01:00.378487 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-fernet-keys\") pod \"keystone-cron-29333461-s8498\" (UID: \"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59\") " pod="openstack/keystone-cron-29333461-s8498" Oct 09 11:01:00 crc kubenswrapper[4740]: I1009 11:01:00.378526 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-combined-ca-bundle\") pod \"keystone-cron-29333461-s8498\" (UID: \"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59\") " pod="openstack/keystone-cron-29333461-s8498" Oct 09 11:01:00 crc kubenswrapper[4740]: I1009 11:01:00.378552 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stk87\" (UniqueName: \"kubernetes.io/projected/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-kube-api-access-stk87\") pod \"keystone-cron-29333461-s8498\" (UID: \"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59\") " pod="openstack/keystone-cron-29333461-s8498" Oct 09 11:01:00 crc kubenswrapper[4740]: I1009 11:01:00.387451 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-combined-ca-bundle\") pod \"keystone-cron-29333461-s8498\" (UID: \"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59\") " pod="openstack/keystone-cron-29333461-s8498" Oct 09 11:01:00 crc kubenswrapper[4740]: I1009 11:01:00.388006 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-fernet-keys\") pod \"keystone-cron-29333461-s8498\" (UID: \"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59\") " pod="openstack/keystone-cron-29333461-s8498" Oct 09 11:01:00 crc kubenswrapper[4740]: I1009 11:01:00.388819 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-config-data\") pod \"keystone-cron-29333461-s8498\" (UID: \"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59\") " pod="openstack/keystone-cron-29333461-s8498" Oct 09 11:01:00 crc kubenswrapper[4740]: I1009 11:01:00.405062 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stk87\" (UniqueName: \"kubernetes.io/projected/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-kube-api-access-stk87\") pod \"keystone-cron-29333461-s8498\" (UID: \"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59\") " pod="openstack/keystone-cron-29333461-s8498" Oct 09 11:01:00 crc kubenswrapper[4740]: I1009 11:01:00.502110 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29333461-s8498" Oct 09 11:01:00 crc kubenswrapper[4740]: I1009 11:01:00.953736 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29333461-s8498"] Oct 09 11:01:00 crc kubenswrapper[4740]: W1009 11:01:00.966515 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56c17d1b_f3e0_4ca7_ad1f_ac1314036f59.slice/crio-85c534e0894a4c46ab92bb5cacec06c2e581b388e0ebc3523e5429128cbd7dc6 WatchSource:0}: Error finding container 85c534e0894a4c46ab92bb5cacec06c2e581b388e0ebc3523e5429128cbd7dc6: Status 404 returned error can't find the container with id 85c534e0894a4c46ab92bb5cacec06c2e581b388e0ebc3523e5429128cbd7dc6 Oct 09 11:01:01 crc kubenswrapper[4740]: I1009 11:01:01.379795 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29333461-s8498" event={"ID":"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59","Type":"ContainerStarted","Data":"fa2c0febca4614dbdcebf3cf0c85c0a26686d83e34b43e3fecfd56d72ea91d05"} Oct 09 11:01:01 crc kubenswrapper[4740]: I1009 11:01:01.380249 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29333461-s8498" event={"ID":"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59","Type":"ContainerStarted","Data":"85c534e0894a4c46ab92bb5cacec06c2e581b388e0ebc3523e5429128cbd7dc6"} Oct 09 11:01:01 crc kubenswrapper[4740]: I1009 11:01:01.400433 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29333461-s8498" podStartSLOduration=1.400410058 podStartE2EDuration="1.400410058s" podCreationTimestamp="2025-10-09 11:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 11:01:01.393817788 +0000 UTC m=+2000.356018169" watchObservedRunningTime="2025-10-09 11:01:01.400410058 +0000 UTC m=+2000.362610439" Oct 09 11:01:03 crc 
kubenswrapper[4740]: I1009 11:01:03.408356 4740 generic.go:334] "Generic (PLEG): container finished" podID="56c17d1b-f3e0-4ca7-ad1f-ac1314036f59" containerID="fa2c0febca4614dbdcebf3cf0c85c0a26686d83e34b43e3fecfd56d72ea91d05" exitCode=0 Oct 09 11:01:03 crc kubenswrapper[4740]: I1009 11:01:03.408466 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29333461-s8498" event={"ID":"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59","Type":"ContainerDied","Data":"fa2c0febca4614dbdcebf3cf0c85c0a26686d83e34b43e3fecfd56d72ea91d05"} Oct 09 11:01:04 crc kubenswrapper[4740]: I1009 11:01:04.853009 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29333461-s8498" Oct 09 11:01:04 crc kubenswrapper[4740]: I1009 11:01:04.986352 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-config-data\") pod \"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59\" (UID: \"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59\") " Oct 09 11:01:04 crc kubenswrapper[4740]: I1009 11:01:04.986814 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-combined-ca-bundle\") pod \"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59\" (UID: \"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59\") " Oct 09 11:01:04 crc kubenswrapper[4740]: I1009 11:01:04.986868 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stk87\" (UniqueName: \"kubernetes.io/projected/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-kube-api-access-stk87\") pod \"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59\" (UID: \"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59\") " Oct 09 11:01:04 crc kubenswrapper[4740]: I1009 11:01:04.986901 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-fernet-keys\") pod \"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59\" (UID: \"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59\") " Oct 09 11:01:04 crc kubenswrapper[4740]: I1009 11:01:04.991018 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-kube-api-access-stk87" (OuterVolumeSpecName: "kube-api-access-stk87") pod "56c17d1b-f3e0-4ca7-ad1f-ac1314036f59" (UID: "56c17d1b-f3e0-4ca7-ad1f-ac1314036f59"). InnerVolumeSpecName "kube-api-access-stk87". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:01:05 crc kubenswrapper[4740]: I1009 11:01:05.010023 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "56c17d1b-f3e0-4ca7-ad1f-ac1314036f59" (UID: "56c17d1b-f3e0-4ca7-ad1f-ac1314036f59"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:01:05 crc kubenswrapper[4740]: I1009 11:01:05.027678 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56c17d1b-f3e0-4ca7-ad1f-ac1314036f59" (UID: "56c17d1b-f3e0-4ca7-ad1f-ac1314036f59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:01:05 crc kubenswrapper[4740]: I1009 11:01:05.047846 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-config-data" (OuterVolumeSpecName: "config-data") pod "56c17d1b-f3e0-4ca7-ad1f-ac1314036f59" (UID: "56c17d1b-f3e0-4ca7-ad1f-ac1314036f59"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:01:05 crc kubenswrapper[4740]: I1009 11:01:05.088685 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 11:01:05 crc kubenswrapper[4740]: I1009 11:01:05.088718 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 11:01:05 crc kubenswrapper[4740]: I1009 11:01:05.088731 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stk87\" (UniqueName: \"kubernetes.io/projected/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-kube-api-access-stk87\") on node \"crc\" DevicePath \"\"" Oct 09 11:01:05 crc kubenswrapper[4740]: I1009 11:01:05.088740 4740 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/56c17d1b-f3e0-4ca7-ad1f-ac1314036f59-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 09 11:01:05 crc kubenswrapper[4740]: I1009 11:01:05.408291 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 11:01:05 crc kubenswrapper[4740]: I1009 11:01:05.408365 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 11:01:05 crc kubenswrapper[4740]: I1009 11:01:05.426321 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-cron-29333461-s8498" event={"ID":"56c17d1b-f3e0-4ca7-ad1f-ac1314036f59","Type":"ContainerDied","Data":"85c534e0894a4c46ab92bb5cacec06c2e581b388e0ebc3523e5429128cbd7dc6"} Oct 09 11:01:05 crc kubenswrapper[4740]: I1009 11:01:05.426366 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85c534e0894a4c46ab92bb5cacec06c2e581b388e0ebc3523e5429128cbd7dc6" Oct 09 11:01:05 crc kubenswrapper[4740]: I1009 11:01:05.426421 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29333461-s8498" Oct 09 11:01:35 crc kubenswrapper[4740]: I1009 11:01:35.408197 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 11:01:35 crc kubenswrapper[4740]: I1009 11:01:35.408730 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 11:01:35 crc kubenswrapper[4740]: I1009 11:01:35.408786 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 11:01:35 crc kubenswrapper[4740]: I1009 11:01:35.409452 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3448287b6cd68c3403bb27caa7100e27359be5b949fa1d87e08098aaeba8b363"} pod="openshift-machine-config-operator/machine-config-daemon-kdjch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 
11:01:35 crc kubenswrapper[4740]: I1009 11:01:35.409501 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" containerID="cri-o://3448287b6cd68c3403bb27caa7100e27359be5b949fa1d87e08098aaeba8b363" gracePeriod=600 Oct 09 11:01:35 crc kubenswrapper[4740]: I1009 11:01:35.679591 4740 generic.go:334] "Generic (PLEG): container finished" podID="223b849a-db98-4f56-a649-9e144189950a" containerID="3448287b6cd68c3403bb27caa7100e27359be5b949fa1d87e08098aaeba8b363" exitCode=0 Oct 09 11:01:35 crc kubenswrapper[4740]: I1009 11:01:35.679649 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerDied","Data":"3448287b6cd68c3403bb27caa7100e27359be5b949fa1d87e08098aaeba8b363"} Oct 09 11:01:35 crc kubenswrapper[4740]: I1009 11:01:35.679998 4740 scope.go:117] "RemoveContainer" containerID="b6abc420b2de21b6ad72277790f87e5c6dd5fe0927fed71c2087aee093f42562" Oct 09 11:01:36 crc kubenswrapper[4740]: I1009 11:01:36.701611 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerStarted","Data":"a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689"} Oct 09 11:01:54 crc kubenswrapper[4740]: I1009 11:01:54.871556 4740 generic.go:334] "Generic (PLEG): container finished" podID="72061500-62b1-404d-8def-280fcca2e73f" containerID="e24ad72d5ef03f04171553a0e97035d9fb68b1e620f5e6c1edce59cfa81b39b5" exitCode=0 Oct 09 11:01:54 crc kubenswrapper[4740]: I1009 11:01:54.871724 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" 
event={"ID":"72061500-62b1-404d-8def-280fcca2e73f","Type":"ContainerDied","Data":"e24ad72d5ef03f04171553a0e97035d9fb68b1e620f5e6c1edce59cfa81b39b5"} Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.315928 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.430484 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5nd4\" (UniqueName: \"kubernetes.io/projected/72061500-62b1-404d-8def-280fcca2e73f-kube-api-access-g5nd4\") pod \"72061500-62b1-404d-8def-280fcca2e73f\" (UID: \"72061500-62b1-404d-8def-280fcca2e73f\") " Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.430593 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/72061500-62b1-404d-8def-280fcca2e73f-ssh-key\") pod \"72061500-62b1-404d-8def-280fcca2e73f\" (UID: \"72061500-62b1-404d-8def-280fcca2e73f\") " Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.430645 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72061500-62b1-404d-8def-280fcca2e73f-inventory\") pod \"72061500-62b1-404d-8def-280fcca2e73f\" (UID: \"72061500-62b1-404d-8def-280fcca2e73f\") " Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.430668 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/72061500-62b1-404d-8def-280fcca2e73f-ovncontroller-config-0\") pod \"72061500-62b1-404d-8def-280fcca2e73f\" (UID: \"72061500-62b1-404d-8def-280fcca2e73f\") " Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.430690 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/72061500-62b1-404d-8def-280fcca2e73f-ovn-combined-ca-bundle\") pod \"72061500-62b1-404d-8def-280fcca2e73f\" (UID: \"72061500-62b1-404d-8def-280fcca2e73f\") " Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.436597 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72061500-62b1-404d-8def-280fcca2e73f-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "72061500-62b1-404d-8def-280fcca2e73f" (UID: "72061500-62b1-404d-8def-280fcca2e73f"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.436646 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72061500-62b1-404d-8def-280fcca2e73f-kube-api-access-g5nd4" (OuterVolumeSpecName: "kube-api-access-g5nd4") pod "72061500-62b1-404d-8def-280fcca2e73f" (UID: "72061500-62b1-404d-8def-280fcca2e73f"). InnerVolumeSpecName "kube-api-access-g5nd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.459940 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72061500-62b1-404d-8def-280fcca2e73f-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "72061500-62b1-404d-8def-280fcca2e73f" (UID: "72061500-62b1-404d-8def-280fcca2e73f"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.462850 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72061500-62b1-404d-8def-280fcca2e73f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "72061500-62b1-404d-8def-280fcca2e73f" (UID: "72061500-62b1-404d-8def-280fcca2e73f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.464943 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72061500-62b1-404d-8def-280fcca2e73f-inventory" (OuterVolumeSpecName: "inventory") pod "72061500-62b1-404d-8def-280fcca2e73f" (UID: "72061500-62b1-404d-8def-280fcca2e73f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.533481 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5nd4\" (UniqueName: \"kubernetes.io/projected/72061500-62b1-404d-8def-280fcca2e73f-kube-api-access-g5nd4\") on node \"crc\" DevicePath \"\"" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.533515 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/72061500-62b1-404d-8def-280fcca2e73f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.533524 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72061500-62b1-404d-8def-280fcca2e73f-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.533546 4740 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/72061500-62b1-404d-8def-280fcca2e73f-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.533556 4740 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72061500-62b1-404d-8def-280fcca2e73f-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.891919 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" event={"ID":"72061500-62b1-404d-8def-280fcca2e73f","Type":"ContainerDied","Data":"94353f17ffbd037d561c6d4c809b50275be62882b6f80050012f46c61e557677"} Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.891992 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94353f17ffbd037d561c6d4c809b50275be62882b6f80050012f46c61e557677" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.892078 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hd9p5" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.989519 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj"] Oct 09 11:01:56 crc kubenswrapper[4740]: E1009 11:01:56.989869 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c17d1b-f3e0-4ca7-ad1f-ac1314036f59" containerName="keystone-cron" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.989885 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c17d1b-f3e0-4ca7-ad1f-ac1314036f59" containerName="keystone-cron" Oct 09 11:01:56 crc kubenswrapper[4740]: E1009 11:01:56.989916 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72061500-62b1-404d-8def-280fcca2e73f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.989923 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="72061500-62b1-404d-8def-280fcca2e73f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.990090 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c17d1b-f3e0-4ca7-ad1f-ac1314036f59" containerName="keystone-cron" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.990121 4740 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="72061500-62b1-404d-8def-280fcca2e73f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.990673 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.992480 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.992726 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.992905 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.993039 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hslsm" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.993066 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 11:01:56 crc kubenswrapper[4740]: I1009 11:01:56.993525 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 11:01:57 crc kubenswrapper[4740]: I1009 11:01:57.016337 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj"] Oct 09 11:01:57 crc kubenswrapper[4740]: I1009 11:01:57.044297 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwr57\" (UniqueName: \"kubernetes.io/projected/cdb17de8-f861-4899-8e4d-455cd554cf43-kube-api-access-jwr57\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:01:57 crc kubenswrapper[4740]: I1009 11:01:57.044388 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:01:57 crc kubenswrapper[4740]: I1009 11:01:57.044422 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:01:57 crc kubenswrapper[4740]: I1009 11:01:57.044520 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:01:57 crc kubenswrapper[4740]: I1009 11:01:57.044559 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:01:57 crc kubenswrapper[4740]: I1009 11:01:57.044604 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:01:57 crc kubenswrapper[4740]: I1009 11:01:57.146142 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwr57\" (UniqueName: \"kubernetes.io/projected/cdb17de8-f861-4899-8e4d-455cd554cf43-kube-api-access-jwr57\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:01:57 crc kubenswrapper[4740]: I1009 11:01:57.146251 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:01:57 crc kubenswrapper[4740]: I1009 11:01:57.146299 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:01:57 crc kubenswrapper[4740]: 
I1009 11:01:57.146415 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:01:57 crc kubenswrapper[4740]: I1009 11:01:57.146698 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:01:57 crc kubenswrapper[4740]: I1009 11:01:57.146783 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:01:57 crc kubenswrapper[4740]: I1009 11:01:57.150194 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:01:57 crc kubenswrapper[4740]: I1009 11:01:57.150331 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:01:57 crc kubenswrapper[4740]: I1009 11:01:57.151520 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:01:57 crc kubenswrapper[4740]: I1009 11:01:57.155839 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:01:57 crc kubenswrapper[4740]: I1009 11:01:57.157078 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:01:57 crc kubenswrapper[4740]: I1009 11:01:57.164690 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwr57\" (UniqueName: \"kubernetes.io/projected/cdb17de8-f861-4899-8e4d-455cd554cf43-kube-api-access-jwr57\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:01:57 crc kubenswrapper[4740]: I1009 11:01:57.306190 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:01:57 crc kubenswrapper[4740]: I1009 11:01:57.824044 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj"] Oct 09 11:01:57 crc kubenswrapper[4740]: I1009 11:01:57.905460 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" event={"ID":"cdb17de8-f861-4899-8e4d-455cd554cf43","Type":"ContainerStarted","Data":"38b936a209c6a36ad071f356cd86f61a92a25365ce07f22db52510ae3d799b51"} Oct 09 11:01:58 crc kubenswrapper[4740]: I1009 11:01:58.914134 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" event={"ID":"cdb17de8-f861-4899-8e4d-455cd554cf43","Type":"ContainerStarted","Data":"704cd8b27369e455de86884f93f12bca82b99d9e44f131aaf94b9033eeabf7d3"} Oct 09 11:01:58 crc kubenswrapper[4740]: I1009 11:01:58.934251 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" podStartSLOduration=2.466044572 podStartE2EDuration="2.934230114s" podCreationTimestamp="2025-10-09 11:01:56 +0000 UTC" firstStartedPulling="2025-10-09 11:01:57.822878686 +0000 UTC m=+2056.785079077" lastFinishedPulling="2025-10-09 11:01:58.291064238 +0000 UTC m=+2057.253264619" observedRunningTime="2025-10-09 11:01:58.928198769 +0000 UTC m=+2057.890399160" watchObservedRunningTime="2025-10-09 11:01:58.934230114 +0000 UTC m=+2057.896430495" Oct 09 11:02:08 crc kubenswrapper[4740]: I1009 
11:02:08.335263 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x2zgh"] Oct 09 11:02:08 crc kubenswrapper[4740]: I1009 11:02:08.337488 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2zgh" Oct 09 11:02:08 crc kubenswrapper[4740]: I1009 11:02:08.355198 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2zgh"] Oct 09 11:02:08 crc kubenswrapper[4740]: I1009 11:02:08.355774 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sbgf\" (UniqueName: \"kubernetes.io/projected/c1708acf-572a-429e-85a6-210c18fdcaa1-kube-api-access-5sbgf\") pod \"redhat-marketplace-x2zgh\" (UID: \"c1708acf-572a-429e-85a6-210c18fdcaa1\") " pod="openshift-marketplace/redhat-marketplace-x2zgh" Oct 09 11:02:08 crc kubenswrapper[4740]: I1009 11:02:08.355817 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1708acf-572a-429e-85a6-210c18fdcaa1-catalog-content\") pod \"redhat-marketplace-x2zgh\" (UID: \"c1708acf-572a-429e-85a6-210c18fdcaa1\") " pod="openshift-marketplace/redhat-marketplace-x2zgh" Oct 09 11:02:08 crc kubenswrapper[4740]: I1009 11:02:08.355897 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1708acf-572a-429e-85a6-210c18fdcaa1-utilities\") pod \"redhat-marketplace-x2zgh\" (UID: \"c1708acf-572a-429e-85a6-210c18fdcaa1\") " pod="openshift-marketplace/redhat-marketplace-x2zgh" Oct 09 11:02:08 crc kubenswrapper[4740]: I1009 11:02:08.458381 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1708acf-572a-429e-85a6-210c18fdcaa1-utilities\") pod 
\"redhat-marketplace-x2zgh\" (UID: \"c1708acf-572a-429e-85a6-210c18fdcaa1\") " pod="openshift-marketplace/redhat-marketplace-x2zgh" Oct 09 11:02:08 crc kubenswrapper[4740]: I1009 11:02:08.458586 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sbgf\" (UniqueName: \"kubernetes.io/projected/c1708acf-572a-429e-85a6-210c18fdcaa1-kube-api-access-5sbgf\") pod \"redhat-marketplace-x2zgh\" (UID: \"c1708acf-572a-429e-85a6-210c18fdcaa1\") " pod="openshift-marketplace/redhat-marketplace-x2zgh" Oct 09 11:02:08 crc kubenswrapper[4740]: I1009 11:02:08.458614 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1708acf-572a-429e-85a6-210c18fdcaa1-catalog-content\") pod \"redhat-marketplace-x2zgh\" (UID: \"c1708acf-572a-429e-85a6-210c18fdcaa1\") " pod="openshift-marketplace/redhat-marketplace-x2zgh" Oct 09 11:02:08 crc kubenswrapper[4740]: I1009 11:02:08.459010 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1708acf-572a-429e-85a6-210c18fdcaa1-utilities\") pod \"redhat-marketplace-x2zgh\" (UID: \"c1708acf-572a-429e-85a6-210c18fdcaa1\") " pod="openshift-marketplace/redhat-marketplace-x2zgh" Oct 09 11:02:08 crc kubenswrapper[4740]: I1009 11:02:08.459043 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1708acf-572a-429e-85a6-210c18fdcaa1-catalog-content\") pod \"redhat-marketplace-x2zgh\" (UID: \"c1708acf-572a-429e-85a6-210c18fdcaa1\") " pod="openshift-marketplace/redhat-marketplace-x2zgh" Oct 09 11:02:08 crc kubenswrapper[4740]: I1009 11:02:08.484300 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sbgf\" (UniqueName: \"kubernetes.io/projected/c1708acf-572a-429e-85a6-210c18fdcaa1-kube-api-access-5sbgf\") pod \"redhat-marketplace-x2zgh\" (UID: 
\"c1708acf-572a-429e-85a6-210c18fdcaa1\") " pod="openshift-marketplace/redhat-marketplace-x2zgh" Oct 09 11:02:08 crc kubenswrapper[4740]: I1009 11:02:08.659959 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2zgh" Oct 09 11:02:09 crc kubenswrapper[4740]: I1009 11:02:09.161199 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2zgh"] Oct 09 11:02:09 crc kubenswrapper[4740]: W1009 11:02:09.172978 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1708acf_572a_429e_85a6_210c18fdcaa1.slice/crio-45a559690663b03659db2e1364b2fb741d43bf114a5f8831709bc468b0743abe WatchSource:0}: Error finding container 45a559690663b03659db2e1364b2fb741d43bf114a5f8831709bc468b0743abe: Status 404 returned error can't find the container with id 45a559690663b03659db2e1364b2fb741d43bf114a5f8831709bc468b0743abe Oct 09 11:02:10 crc kubenswrapper[4740]: I1009 11:02:10.004644 4740 generic.go:334] "Generic (PLEG): container finished" podID="c1708acf-572a-429e-85a6-210c18fdcaa1" containerID="86e9ddae336eb8d1127a979a469f96f113a9d4334000f07e8fee9e16a8373c61" exitCode=0 Oct 09 11:02:10 crc kubenswrapper[4740]: I1009 11:02:10.004699 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2zgh" event={"ID":"c1708acf-572a-429e-85a6-210c18fdcaa1","Type":"ContainerDied","Data":"86e9ddae336eb8d1127a979a469f96f113a9d4334000f07e8fee9e16a8373c61"} Oct 09 11:02:10 crc kubenswrapper[4740]: I1009 11:02:10.006903 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2zgh" event={"ID":"c1708acf-572a-429e-85a6-210c18fdcaa1","Type":"ContainerStarted","Data":"45a559690663b03659db2e1364b2fb741d43bf114a5f8831709bc468b0743abe"} Oct 09 11:02:12 crc kubenswrapper[4740]: I1009 11:02:12.025277 4740 generic.go:334] "Generic (PLEG): 
container finished" podID="c1708acf-572a-429e-85a6-210c18fdcaa1" containerID="1fca3f664ca00677226edb8f6271f810923146281fe945f67f9dbba7d878ec7b" exitCode=0 Oct 09 11:02:12 crc kubenswrapper[4740]: I1009 11:02:12.025382 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2zgh" event={"ID":"c1708acf-572a-429e-85a6-210c18fdcaa1","Type":"ContainerDied","Data":"1fca3f664ca00677226edb8f6271f810923146281fe945f67f9dbba7d878ec7b"} Oct 09 11:02:13 crc kubenswrapper[4740]: I1009 11:02:13.035763 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2zgh" event={"ID":"c1708acf-572a-429e-85a6-210c18fdcaa1","Type":"ContainerStarted","Data":"a936873ee026d4c1b18e9e382c74ae59af20c18146332b3b7a02419125693f94"} Oct 09 11:02:13 crc kubenswrapper[4740]: I1009 11:02:13.056601 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x2zgh" podStartSLOduration=2.600764079 podStartE2EDuration="5.056581897s" podCreationTimestamp="2025-10-09 11:02:08 +0000 UTC" firstStartedPulling="2025-10-09 11:02:10.007866225 +0000 UTC m=+2068.970066606" lastFinishedPulling="2025-10-09 11:02:12.463684043 +0000 UTC m=+2071.425884424" observedRunningTime="2025-10-09 11:02:13.049864614 +0000 UTC m=+2072.012065025" watchObservedRunningTime="2025-10-09 11:02:13.056581897 +0000 UTC m=+2072.018782278" Oct 09 11:02:18 crc kubenswrapper[4740]: I1009 11:02:18.661174 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x2zgh" Oct 09 11:02:18 crc kubenswrapper[4740]: I1009 11:02:18.661506 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x2zgh" Oct 09 11:02:18 crc kubenswrapper[4740]: I1009 11:02:18.729051 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x2zgh" 
Oct 09 11:02:19 crc kubenswrapper[4740]: I1009 11:02:19.143793 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x2zgh" Oct 09 11:02:19 crc kubenswrapper[4740]: I1009 11:02:19.209559 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2zgh"] Oct 09 11:02:21 crc kubenswrapper[4740]: I1009 11:02:21.107230 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x2zgh" podUID="c1708acf-572a-429e-85a6-210c18fdcaa1" containerName="registry-server" containerID="cri-o://a936873ee026d4c1b18e9e382c74ae59af20c18146332b3b7a02419125693f94" gracePeriod=2 Oct 09 11:02:21 crc kubenswrapper[4740]: I1009 11:02:21.658093 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2zgh" Oct 09 11:02:21 crc kubenswrapper[4740]: I1009 11:02:21.796884 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1708acf-572a-429e-85a6-210c18fdcaa1-utilities\") pod \"c1708acf-572a-429e-85a6-210c18fdcaa1\" (UID: \"c1708acf-572a-429e-85a6-210c18fdcaa1\") " Oct 09 11:02:21 crc kubenswrapper[4740]: I1009 11:02:21.796974 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sbgf\" (UniqueName: \"kubernetes.io/projected/c1708acf-572a-429e-85a6-210c18fdcaa1-kube-api-access-5sbgf\") pod \"c1708acf-572a-429e-85a6-210c18fdcaa1\" (UID: \"c1708acf-572a-429e-85a6-210c18fdcaa1\") " Oct 09 11:02:21 crc kubenswrapper[4740]: I1009 11:02:21.797164 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1708acf-572a-429e-85a6-210c18fdcaa1-catalog-content\") pod \"c1708acf-572a-429e-85a6-210c18fdcaa1\" (UID: \"c1708acf-572a-429e-85a6-210c18fdcaa1\") " 
Oct 09 11:02:21 crc kubenswrapper[4740]: I1009 11:02:21.798500 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1708acf-572a-429e-85a6-210c18fdcaa1-utilities" (OuterVolumeSpecName: "utilities") pod "c1708acf-572a-429e-85a6-210c18fdcaa1" (UID: "c1708acf-572a-429e-85a6-210c18fdcaa1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:02:21 crc kubenswrapper[4740]: I1009 11:02:21.805418 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1708acf-572a-429e-85a6-210c18fdcaa1-kube-api-access-5sbgf" (OuterVolumeSpecName: "kube-api-access-5sbgf") pod "c1708acf-572a-429e-85a6-210c18fdcaa1" (UID: "c1708acf-572a-429e-85a6-210c18fdcaa1"). InnerVolumeSpecName "kube-api-access-5sbgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:02:21 crc kubenswrapper[4740]: I1009 11:02:21.811406 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1708acf-572a-429e-85a6-210c18fdcaa1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1708acf-572a-429e-85a6-210c18fdcaa1" (UID: "c1708acf-572a-429e-85a6-210c18fdcaa1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:02:21 crc kubenswrapper[4740]: I1009 11:02:21.899776 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1708acf-572a-429e-85a6-210c18fdcaa1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 11:02:21 crc kubenswrapper[4740]: I1009 11:02:21.899811 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1708acf-572a-429e-85a6-210c18fdcaa1-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 11:02:21 crc kubenswrapper[4740]: I1009 11:02:21.899820 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sbgf\" (UniqueName: \"kubernetes.io/projected/c1708acf-572a-429e-85a6-210c18fdcaa1-kube-api-access-5sbgf\") on node \"crc\" DevicePath \"\"" Oct 09 11:02:22 crc kubenswrapper[4740]: I1009 11:02:22.118149 4740 generic.go:334] "Generic (PLEG): container finished" podID="c1708acf-572a-429e-85a6-210c18fdcaa1" containerID="a936873ee026d4c1b18e9e382c74ae59af20c18146332b3b7a02419125693f94" exitCode=0 Oct 09 11:02:22 crc kubenswrapper[4740]: I1009 11:02:22.118202 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2zgh" event={"ID":"c1708acf-572a-429e-85a6-210c18fdcaa1","Type":"ContainerDied","Data":"a936873ee026d4c1b18e9e382c74ae59af20c18146332b3b7a02419125693f94"} Oct 09 11:02:22 crc kubenswrapper[4740]: I1009 11:02:22.118228 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2zgh" Oct 09 11:02:22 crc kubenswrapper[4740]: I1009 11:02:22.118255 4740 scope.go:117] "RemoveContainer" containerID="a936873ee026d4c1b18e9e382c74ae59af20c18146332b3b7a02419125693f94" Oct 09 11:02:22 crc kubenswrapper[4740]: I1009 11:02:22.118243 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2zgh" event={"ID":"c1708acf-572a-429e-85a6-210c18fdcaa1","Type":"ContainerDied","Data":"45a559690663b03659db2e1364b2fb741d43bf114a5f8831709bc468b0743abe"} Oct 09 11:02:22 crc kubenswrapper[4740]: I1009 11:02:22.155489 4740 scope.go:117] "RemoveContainer" containerID="1fca3f664ca00677226edb8f6271f810923146281fe945f67f9dbba7d878ec7b" Oct 09 11:02:22 crc kubenswrapper[4740]: I1009 11:02:22.158501 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2zgh"] Oct 09 11:02:22 crc kubenswrapper[4740]: I1009 11:02:22.164974 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2zgh"] Oct 09 11:02:22 crc kubenswrapper[4740]: I1009 11:02:22.180415 4740 scope.go:117] "RemoveContainer" containerID="86e9ddae336eb8d1127a979a469f96f113a9d4334000f07e8fee9e16a8373c61" Oct 09 11:02:22 crc kubenswrapper[4740]: I1009 11:02:22.228848 4740 scope.go:117] "RemoveContainer" containerID="a936873ee026d4c1b18e9e382c74ae59af20c18146332b3b7a02419125693f94" Oct 09 11:02:22 crc kubenswrapper[4740]: E1009 11:02:22.229319 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a936873ee026d4c1b18e9e382c74ae59af20c18146332b3b7a02419125693f94\": container with ID starting with a936873ee026d4c1b18e9e382c74ae59af20c18146332b3b7a02419125693f94 not found: ID does not exist" containerID="a936873ee026d4c1b18e9e382c74ae59af20c18146332b3b7a02419125693f94" Oct 09 11:02:22 crc kubenswrapper[4740]: I1009 11:02:22.229367 4740 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a936873ee026d4c1b18e9e382c74ae59af20c18146332b3b7a02419125693f94"} err="failed to get container status \"a936873ee026d4c1b18e9e382c74ae59af20c18146332b3b7a02419125693f94\": rpc error: code = NotFound desc = could not find container \"a936873ee026d4c1b18e9e382c74ae59af20c18146332b3b7a02419125693f94\": container with ID starting with a936873ee026d4c1b18e9e382c74ae59af20c18146332b3b7a02419125693f94 not found: ID does not exist" Oct 09 11:02:22 crc kubenswrapper[4740]: I1009 11:02:22.229399 4740 scope.go:117] "RemoveContainer" containerID="1fca3f664ca00677226edb8f6271f810923146281fe945f67f9dbba7d878ec7b" Oct 09 11:02:22 crc kubenswrapper[4740]: E1009 11:02:22.229705 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fca3f664ca00677226edb8f6271f810923146281fe945f67f9dbba7d878ec7b\": container with ID starting with 1fca3f664ca00677226edb8f6271f810923146281fe945f67f9dbba7d878ec7b not found: ID does not exist" containerID="1fca3f664ca00677226edb8f6271f810923146281fe945f67f9dbba7d878ec7b" Oct 09 11:02:22 crc kubenswrapper[4740]: I1009 11:02:22.229738 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fca3f664ca00677226edb8f6271f810923146281fe945f67f9dbba7d878ec7b"} err="failed to get container status \"1fca3f664ca00677226edb8f6271f810923146281fe945f67f9dbba7d878ec7b\": rpc error: code = NotFound desc = could not find container \"1fca3f664ca00677226edb8f6271f810923146281fe945f67f9dbba7d878ec7b\": container with ID starting with 1fca3f664ca00677226edb8f6271f810923146281fe945f67f9dbba7d878ec7b not found: ID does not exist" Oct 09 11:02:22 crc kubenswrapper[4740]: I1009 11:02:22.229771 4740 scope.go:117] "RemoveContainer" containerID="86e9ddae336eb8d1127a979a469f96f113a9d4334000f07e8fee9e16a8373c61" Oct 09 11:02:22 crc kubenswrapper[4740]: E1009 
11:02:22.230235 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86e9ddae336eb8d1127a979a469f96f113a9d4334000f07e8fee9e16a8373c61\": container with ID starting with 86e9ddae336eb8d1127a979a469f96f113a9d4334000f07e8fee9e16a8373c61 not found: ID does not exist" containerID="86e9ddae336eb8d1127a979a469f96f113a9d4334000f07e8fee9e16a8373c61" Oct 09 11:02:22 crc kubenswrapper[4740]: I1009 11:02:22.230260 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86e9ddae336eb8d1127a979a469f96f113a9d4334000f07e8fee9e16a8373c61"} err="failed to get container status \"86e9ddae336eb8d1127a979a469f96f113a9d4334000f07e8fee9e16a8373c61\": rpc error: code = NotFound desc = could not find container \"86e9ddae336eb8d1127a979a469f96f113a9d4334000f07e8fee9e16a8373c61\": container with ID starting with 86e9ddae336eb8d1127a979a469f96f113a9d4334000f07e8fee9e16a8373c61 not found: ID does not exist" Oct 09 11:02:23 crc kubenswrapper[4740]: I1009 11:02:23.764994 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1708acf-572a-429e-85a6-210c18fdcaa1" path="/var/lib/kubelet/pods/c1708acf-572a-429e-85a6-210c18fdcaa1/volumes" Oct 09 11:02:43 crc kubenswrapper[4740]: I1009 11:02:43.423707 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4hj9h"] Oct 09 11:02:43 crc kubenswrapper[4740]: E1009 11:02:43.425392 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1708acf-572a-429e-85a6-210c18fdcaa1" containerName="extract-content" Oct 09 11:02:43 crc kubenswrapper[4740]: I1009 11:02:43.425410 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1708acf-572a-429e-85a6-210c18fdcaa1" containerName="extract-content" Oct 09 11:02:43 crc kubenswrapper[4740]: E1009 11:02:43.425435 4740 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c1708acf-572a-429e-85a6-210c18fdcaa1" containerName="extract-utilities" Oct 09 11:02:43 crc kubenswrapper[4740]: I1009 11:02:43.425442 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1708acf-572a-429e-85a6-210c18fdcaa1" containerName="extract-utilities" Oct 09 11:02:43 crc kubenswrapper[4740]: E1009 11:02:43.425453 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1708acf-572a-429e-85a6-210c18fdcaa1" containerName="registry-server" Oct 09 11:02:43 crc kubenswrapper[4740]: I1009 11:02:43.425459 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1708acf-572a-429e-85a6-210c18fdcaa1" containerName="registry-server" Oct 09 11:02:43 crc kubenswrapper[4740]: I1009 11:02:43.425651 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1708acf-572a-429e-85a6-210c18fdcaa1" containerName="registry-server" Oct 09 11:02:43 crc kubenswrapper[4740]: I1009 11:02:43.427223 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hj9h" Oct 09 11:02:43 crc kubenswrapper[4740]: I1009 11:02:43.474997 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4hj9h"] Oct 09 11:02:43 crc kubenswrapper[4740]: I1009 11:02:43.550082 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99ee0c70-ffa9-4ad5-9607-68c861bd383e-utilities\") pod \"certified-operators-4hj9h\" (UID: \"99ee0c70-ffa9-4ad5-9607-68c861bd383e\") " pod="openshift-marketplace/certified-operators-4hj9h" Oct 09 11:02:43 crc kubenswrapper[4740]: I1009 11:02:43.550354 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xj84\" (UniqueName: \"kubernetes.io/projected/99ee0c70-ffa9-4ad5-9607-68c861bd383e-kube-api-access-9xj84\") pod \"certified-operators-4hj9h\" (UID: 
\"99ee0c70-ffa9-4ad5-9607-68c861bd383e\") " pod="openshift-marketplace/certified-operators-4hj9h" Oct 09 11:02:43 crc kubenswrapper[4740]: I1009 11:02:43.550424 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99ee0c70-ffa9-4ad5-9607-68c861bd383e-catalog-content\") pod \"certified-operators-4hj9h\" (UID: \"99ee0c70-ffa9-4ad5-9607-68c861bd383e\") " pod="openshift-marketplace/certified-operators-4hj9h" Oct 09 11:02:43 crc kubenswrapper[4740]: I1009 11:02:43.651834 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xj84\" (UniqueName: \"kubernetes.io/projected/99ee0c70-ffa9-4ad5-9607-68c861bd383e-kube-api-access-9xj84\") pod \"certified-operators-4hj9h\" (UID: \"99ee0c70-ffa9-4ad5-9607-68c861bd383e\") " pod="openshift-marketplace/certified-operators-4hj9h" Oct 09 11:02:43 crc kubenswrapper[4740]: I1009 11:02:43.652338 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99ee0c70-ffa9-4ad5-9607-68c861bd383e-catalog-content\") pod \"certified-operators-4hj9h\" (UID: \"99ee0c70-ffa9-4ad5-9607-68c861bd383e\") " pod="openshift-marketplace/certified-operators-4hj9h" Oct 09 11:02:43 crc kubenswrapper[4740]: I1009 11:02:43.652378 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99ee0c70-ffa9-4ad5-9607-68c861bd383e-utilities\") pod \"certified-operators-4hj9h\" (UID: \"99ee0c70-ffa9-4ad5-9607-68c861bd383e\") " pod="openshift-marketplace/certified-operators-4hj9h" Oct 09 11:02:43 crc kubenswrapper[4740]: I1009 11:02:43.652864 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99ee0c70-ffa9-4ad5-9607-68c861bd383e-catalog-content\") pod \"certified-operators-4hj9h\" (UID: 
\"99ee0c70-ffa9-4ad5-9607-68c861bd383e\") " pod="openshift-marketplace/certified-operators-4hj9h" Oct 09 11:02:43 crc kubenswrapper[4740]: I1009 11:02:43.653012 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99ee0c70-ffa9-4ad5-9607-68c861bd383e-utilities\") pod \"certified-operators-4hj9h\" (UID: \"99ee0c70-ffa9-4ad5-9607-68c861bd383e\") " pod="openshift-marketplace/certified-operators-4hj9h" Oct 09 11:02:43 crc kubenswrapper[4740]: I1009 11:02:43.671471 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xj84\" (UniqueName: \"kubernetes.io/projected/99ee0c70-ffa9-4ad5-9607-68c861bd383e-kube-api-access-9xj84\") pod \"certified-operators-4hj9h\" (UID: \"99ee0c70-ffa9-4ad5-9607-68c861bd383e\") " pod="openshift-marketplace/certified-operators-4hj9h" Oct 09 11:02:43 crc kubenswrapper[4740]: I1009 11:02:43.775399 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hj9h" Oct 09 11:02:44 crc kubenswrapper[4740]: I1009 11:02:44.320082 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4hj9h"] Oct 09 11:02:45 crc kubenswrapper[4740]: I1009 11:02:45.356657 4740 generic.go:334] "Generic (PLEG): container finished" podID="99ee0c70-ffa9-4ad5-9607-68c861bd383e" containerID="e811f9979738d717fa15ff6455c745697feccb6f3ba1e1b29ff45375ba98fdde" exitCode=0 Oct 09 11:02:45 crc kubenswrapper[4740]: I1009 11:02:45.356702 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hj9h" event={"ID":"99ee0c70-ffa9-4ad5-9607-68c861bd383e","Type":"ContainerDied","Data":"e811f9979738d717fa15ff6455c745697feccb6f3ba1e1b29ff45375ba98fdde"} Oct 09 11:02:45 crc kubenswrapper[4740]: I1009 11:02:45.357109 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hj9h" 
event={"ID":"99ee0c70-ffa9-4ad5-9607-68c861bd383e","Type":"ContainerStarted","Data":"06cbfe9fb89af8d1df04574bacd2db464147292f59c49ee2473155ae53f2be20"} Oct 09 11:02:46 crc kubenswrapper[4740]: I1009 11:02:46.371418 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hj9h" event={"ID":"99ee0c70-ffa9-4ad5-9607-68c861bd383e","Type":"ContainerStarted","Data":"2491dc05a24c13e47dd1a42bc5dac8f05632628e2aeebbbfbbad717c86698032"} Oct 09 11:02:47 crc kubenswrapper[4740]: I1009 11:02:47.381943 4740 generic.go:334] "Generic (PLEG): container finished" podID="99ee0c70-ffa9-4ad5-9607-68c861bd383e" containerID="2491dc05a24c13e47dd1a42bc5dac8f05632628e2aeebbbfbbad717c86698032" exitCode=0 Oct 09 11:02:47 crc kubenswrapper[4740]: I1009 11:02:47.381982 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hj9h" event={"ID":"99ee0c70-ffa9-4ad5-9607-68c861bd383e","Type":"ContainerDied","Data":"2491dc05a24c13e47dd1a42bc5dac8f05632628e2aeebbbfbbad717c86698032"} Oct 09 11:02:48 crc kubenswrapper[4740]: I1009 11:02:48.396803 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hj9h" event={"ID":"99ee0c70-ffa9-4ad5-9607-68c861bd383e","Type":"ContainerStarted","Data":"a9a38a50cd89baa8c4fa56e1d4a72b758b04ec47f199d7661276d39b972f2ea7"} Oct 09 11:02:48 crc kubenswrapper[4740]: I1009 11:02:48.400426 4740 generic.go:334] "Generic (PLEG): container finished" podID="cdb17de8-f861-4899-8e4d-455cd554cf43" containerID="704cd8b27369e455de86884f93f12bca82b99d9e44f131aaf94b9033eeabf7d3" exitCode=0 Oct 09 11:02:48 crc kubenswrapper[4740]: I1009 11:02:48.400472 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" event={"ID":"cdb17de8-f861-4899-8e4d-455cd554cf43","Type":"ContainerDied","Data":"704cd8b27369e455de86884f93f12bca82b99d9e44f131aaf94b9033eeabf7d3"} Oct 09 11:02:48 crc 
kubenswrapper[4740]: I1009 11:02:48.419782 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4hj9h" podStartSLOduration=2.99787888 podStartE2EDuration="5.419740692s" podCreationTimestamp="2025-10-09 11:02:43 +0000 UTC" firstStartedPulling="2025-10-09 11:02:45.359222837 +0000 UTC m=+2104.321423218" lastFinishedPulling="2025-10-09 11:02:47.781084649 +0000 UTC m=+2106.743285030" observedRunningTime="2025-10-09 11:02:48.416053671 +0000 UTC m=+2107.378254072" watchObservedRunningTime="2025-10-09 11:02:48.419740692 +0000 UTC m=+2107.381941073" Oct 09 11:02:49 crc kubenswrapper[4740]: I1009 11:02:49.822275 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:02:49 crc kubenswrapper[4740]: I1009 11:02:49.992829 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-neutron-metadata-combined-ca-bundle\") pod \"cdb17de8-f861-4899-8e4d-455cd554cf43\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " Oct 09 11:02:49 crc kubenswrapper[4740]: I1009 11:02:49.992905 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-ssh-key\") pod \"cdb17de8-f861-4899-8e4d-455cd554cf43\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " Oct 09 11:02:49 crc kubenswrapper[4740]: I1009 11:02:49.992942 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwr57\" (UniqueName: \"kubernetes.io/projected/cdb17de8-f861-4899-8e4d-455cd554cf43-kube-api-access-jwr57\") pod \"cdb17de8-f861-4899-8e4d-455cd554cf43\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " Oct 09 11:02:49 crc kubenswrapper[4740]: I1009 11:02:49.992968 4740 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-inventory\") pod \"cdb17de8-f861-4899-8e4d-455cd554cf43\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " Oct 09 11:02:49 crc kubenswrapper[4740]: I1009 11:02:49.993084 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-nova-metadata-neutron-config-0\") pod \"cdb17de8-f861-4899-8e4d-455cd554cf43\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " Oct 09 11:02:49 crc kubenswrapper[4740]: I1009 11:02:49.993114 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-neutron-ovn-metadata-agent-neutron-config-0\") pod \"cdb17de8-f861-4899-8e4d-455cd554cf43\" (UID: \"cdb17de8-f861-4899-8e4d-455cd554cf43\") " Oct 09 11:02:49 crc kubenswrapper[4740]: I1009 11:02:49.999332 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "cdb17de8-f861-4899-8e4d-455cd554cf43" (UID: "cdb17de8-f861-4899-8e4d-455cd554cf43"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.004043 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb17de8-f861-4899-8e4d-455cd554cf43-kube-api-access-jwr57" (OuterVolumeSpecName: "kube-api-access-jwr57") pod "cdb17de8-f861-4899-8e4d-455cd554cf43" (UID: "cdb17de8-f861-4899-8e4d-455cd554cf43"). InnerVolumeSpecName "kube-api-access-jwr57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.021347 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-inventory" (OuterVolumeSpecName: "inventory") pod "cdb17de8-f861-4899-8e4d-455cd554cf43" (UID: "cdb17de8-f861-4899-8e4d-455cd554cf43"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.021415 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cdb17de8-f861-4899-8e4d-455cd554cf43" (UID: "cdb17de8-f861-4899-8e4d-455cd554cf43"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.022783 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "cdb17de8-f861-4899-8e4d-455cd554cf43" (UID: "cdb17de8-f861-4899-8e4d-455cd554cf43"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.049118 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "cdb17de8-f861-4899-8e4d-455cd554cf43" (UID: "cdb17de8-f861-4899-8e4d-455cd554cf43"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.095753 4740 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.095842 4740 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.095855 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.095866 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwr57\" (UniqueName: \"kubernetes.io/projected/cdb17de8-f861-4899-8e4d-455cd554cf43-kube-api-access-jwr57\") on node \"crc\" DevicePath \"\"" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.095875 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.095883 4740 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cdb17de8-f861-4899-8e4d-455cd554cf43-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.420006 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" 
event={"ID":"cdb17de8-f861-4899-8e4d-455cd554cf43","Type":"ContainerDied","Data":"38b936a209c6a36ad071f356cd86f61a92a25365ce07f22db52510ae3d799b51"} Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.420043 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38b936a209c6a36ad071f356cd86f61a92a25365ce07f22db52510ae3d799b51" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.420077 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.537185 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72"] Oct 09 11:02:50 crc kubenswrapper[4740]: E1009 11:02:50.537557 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdb17de8-f861-4899-8e4d-455cd554cf43" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.537572 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb17de8-f861-4899-8e4d-455cd554cf43" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.537910 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdb17de8-f861-4899-8e4d-455cd554cf43" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.538508 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.540630 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.541649 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.541929 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.541952 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hslsm" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.542988 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.549636 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72"] Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.707154 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xrs9\" (UniqueName: \"kubernetes.io/projected/55748bea-018d-4297-8939-ffec480b42ba-kube-api-access-7xrs9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg72\" (UID: \"55748bea-018d-4297-8939-ffec480b42ba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.707363 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg72\" (UID: \"55748bea-018d-4297-8939-ffec480b42ba\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.707409 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg72\" (UID: \"55748bea-018d-4297-8939-ffec480b42ba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.707542 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg72\" (UID: \"55748bea-018d-4297-8939-ffec480b42ba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.707581 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg72\" (UID: \"55748bea-018d-4297-8939-ffec480b42ba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.809123 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg72\" (UID: \"55748bea-018d-4297-8939-ffec480b42ba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.809268 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg72\" (UID: \"55748bea-018d-4297-8939-ffec480b42ba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.809369 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xrs9\" (UniqueName: \"kubernetes.io/projected/55748bea-018d-4297-8939-ffec480b42ba-kube-api-access-7xrs9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg72\" (UID: \"55748bea-018d-4297-8939-ffec480b42ba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.809523 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg72\" (UID: \"55748bea-018d-4297-8939-ffec480b42ba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.809636 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg72\" (UID: \"55748bea-018d-4297-8939-ffec480b42ba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.814181 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg72\" (UID: \"55748bea-018d-4297-8939-ffec480b42ba\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.815345 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg72\" (UID: \"55748bea-018d-4297-8939-ffec480b42ba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.816502 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg72\" (UID: \"55748bea-018d-4297-8939-ffec480b42ba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.817164 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg72\" (UID: \"55748bea-018d-4297-8939-ffec480b42ba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.841824 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xrs9\" (UniqueName: \"kubernetes.io/projected/55748bea-018d-4297-8939-ffec480b42ba-kube-api-access-7xrs9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg72\" (UID: \"55748bea-018d-4297-8939-ffec480b42ba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" Oct 09 11:02:50 crc kubenswrapper[4740]: I1009 11:02:50.856609 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" Oct 09 11:02:51 crc kubenswrapper[4740]: I1009 11:02:51.412308 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72"] Oct 09 11:02:51 crc kubenswrapper[4740]: W1009 11:02:51.417389 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55748bea_018d_4297_8939_ffec480b42ba.slice/crio-32ec2149a21ebd4f0d59ce47e32d9d9351c3cc6fd695543b0f732aae731fe280 WatchSource:0}: Error finding container 32ec2149a21ebd4f0d59ce47e32d9d9351c3cc6fd695543b0f732aae731fe280: Status 404 returned error can't find the container with id 32ec2149a21ebd4f0d59ce47e32d9d9351c3cc6fd695543b0f732aae731fe280 Oct 09 11:02:51 crc kubenswrapper[4740]: I1009 11:02:51.431113 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" event={"ID":"55748bea-018d-4297-8939-ffec480b42ba","Type":"ContainerStarted","Data":"32ec2149a21ebd4f0d59ce47e32d9d9351c3cc6fd695543b0f732aae731fe280"} Oct 09 11:02:52 crc kubenswrapper[4740]: I1009 11:02:52.442467 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" event={"ID":"55748bea-018d-4297-8939-ffec480b42ba","Type":"ContainerStarted","Data":"21cd9d6c898f88c84e4f099ebfbf139bd23eae8612400e3c4a6be9bc8f2f7457"} Oct 09 11:02:52 crc kubenswrapper[4740]: I1009 11:02:52.460042 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" podStartSLOduration=1.9342408309999999 podStartE2EDuration="2.460025795s" podCreationTimestamp="2025-10-09 11:02:50 +0000 UTC" firstStartedPulling="2025-10-09 11:02:51.420564918 +0000 UTC m=+2110.382765309" lastFinishedPulling="2025-10-09 11:02:51.946349852 +0000 UTC m=+2110.908550273" 
observedRunningTime="2025-10-09 11:02:52.458862883 +0000 UTC m=+2111.421063264" watchObservedRunningTime="2025-10-09 11:02:52.460025795 +0000 UTC m=+2111.422226176" Oct 09 11:02:53 crc kubenswrapper[4740]: I1009 11:02:53.775557 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4hj9h" Oct 09 11:02:53 crc kubenswrapper[4740]: I1009 11:02:53.775975 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4hj9h" Oct 09 11:02:53 crc kubenswrapper[4740]: I1009 11:02:53.829125 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4hj9h" Oct 09 11:02:54 crc kubenswrapper[4740]: I1009 11:02:54.523568 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4hj9h" Oct 09 11:02:54 crc kubenswrapper[4740]: I1009 11:02:54.567592 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4hj9h"] Oct 09 11:02:56 crc kubenswrapper[4740]: I1009 11:02:56.489859 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4hj9h" podUID="99ee0c70-ffa9-4ad5-9607-68c861bd383e" containerName="registry-server" containerID="cri-o://a9a38a50cd89baa8c4fa56e1d4a72b758b04ec47f199d7661276d39b972f2ea7" gracePeriod=2 Oct 09 11:02:56 crc kubenswrapper[4740]: I1009 11:02:56.925733 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4hj9h" Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.048528 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xj84\" (UniqueName: \"kubernetes.io/projected/99ee0c70-ffa9-4ad5-9607-68c861bd383e-kube-api-access-9xj84\") pod \"99ee0c70-ffa9-4ad5-9607-68c861bd383e\" (UID: \"99ee0c70-ffa9-4ad5-9607-68c861bd383e\") " Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.048654 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99ee0c70-ffa9-4ad5-9607-68c861bd383e-utilities\") pod \"99ee0c70-ffa9-4ad5-9607-68c861bd383e\" (UID: \"99ee0c70-ffa9-4ad5-9607-68c861bd383e\") " Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.048744 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99ee0c70-ffa9-4ad5-9607-68c861bd383e-catalog-content\") pod \"99ee0c70-ffa9-4ad5-9607-68c861bd383e\" (UID: \"99ee0c70-ffa9-4ad5-9607-68c861bd383e\") " Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.049910 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99ee0c70-ffa9-4ad5-9607-68c861bd383e-utilities" (OuterVolumeSpecName: "utilities") pod "99ee0c70-ffa9-4ad5-9607-68c861bd383e" (UID: "99ee0c70-ffa9-4ad5-9607-68c861bd383e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.053822 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99ee0c70-ffa9-4ad5-9607-68c861bd383e-kube-api-access-9xj84" (OuterVolumeSpecName: "kube-api-access-9xj84") pod "99ee0c70-ffa9-4ad5-9607-68c861bd383e" (UID: "99ee0c70-ffa9-4ad5-9607-68c861bd383e"). InnerVolumeSpecName "kube-api-access-9xj84". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.115545 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99ee0c70-ffa9-4ad5-9607-68c861bd383e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99ee0c70-ffa9-4ad5-9607-68c861bd383e" (UID: "99ee0c70-ffa9-4ad5-9607-68c861bd383e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.150860 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xj84\" (UniqueName: \"kubernetes.io/projected/99ee0c70-ffa9-4ad5-9607-68c861bd383e-kube-api-access-9xj84\") on node \"crc\" DevicePath \"\"" Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.150897 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99ee0c70-ffa9-4ad5-9607-68c861bd383e-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.150911 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99ee0c70-ffa9-4ad5-9607-68c861bd383e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.504900 4740 generic.go:334] "Generic (PLEG): container finished" podID="99ee0c70-ffa9-4ad5-9607-68c861bd383e" containerID="a9a38a50cd89baa8c4fa56e1d4a72b758b04ec47f199d7661276d39b972f2ea7" exitCode=0 Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.504953 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hj9h" event={"ID":"99ee0c70-ffa9-4ad5-9607-68c861bd383e","Type":"ContainerDied","Data":"a9a38a50cd89baa8c4fa56e1d4a72b758b04ec47f199d7661276d39b972f2ea7"} Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.504980 4740 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hj9h" Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.505002 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hj9h" event={"ID":"99ee0c70-ffa9-4ad5-9607-68c861bd383e","Type":"ContainerDied","Data":"06cbfe9fb89af8d1df04574bacd2db464147292f59c49ee2473155ae53f2be20"} Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.505033 4740 scope.go:117] "RemoveContainer" containerID="a9a38a50cd89baa8c4fa56e1d4a72b758b04ec47f199d7661276d39b972f2ea7" Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.523785 4740 scope.go:117] "RemoveContainer" containerID="2491dc05a24c13e47dd1a42bc5dac8f05632628e2aeebbbfbbad717c86698032" Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.548596 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4hj9h"] Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.562142 4740 scope.go:117] "RemoveContainer" containerID="e811f9979738d717fa15ff6455c745697feccb6f3ba1e1b29ff45375ba98fdde" Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.564581 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4hj9h"] Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.605147 4740 scope.go:117] "RemoveContainer" containerID="a9a38a50cd89baa8c4fa56e1d4a72b758b04ec47f199d7661276d39b972f2ea7" Oct 09 11:02:57 crc kubenswrapper[4740]: E1009 11:02:57.605678 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9a38a50cd89baa8c4fa56e1d4a72b758b04ec47f199d7661276d39b972f2ea7\": container with ID starting with a9a38a50cd89baa8c4fa56e1d4a72b758b04ec47f199d7661276d39b972f2ea7 not found: ID does not exist" containerID="a9a38a50cd89baa8c4fa56e1d4a72b758b04ec47f199d7661276d39b972f2ea7" Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.605717 
4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9a38a50cd89baa8c4fa56e1d4a72b758b04ec47f199d7661276d39b972f2ea7"} err="failed to get container status \"a9a38a50cd89baa8c4fa56e1d4a72b758b04ec47f199d7661276d39b972f2ea7\": rpc error: code = NotFound desc = could not find container \"a9a38a50cd89baa8c4fa56e1d4a72b758b04ec47f199d7661276d39b972f2ea7\": container with ID starting with a9a38a50cd89baa8c4fa56e1d4a72b758b04ec47f199d7661276d39b972f2ea7 not found: ID does not exist" Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.605741 4740 scope.go:117] "RemoveContainer" containerID="2491dc05a24c13e47dd1a42bc5dac8f05632628e2aeebbbfbbad717c86698032" Oct 09 11:02:57 crc kubenswrapper[4740]: E1009 11:02:57.606090 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2491dc05a24c13e47dd1a42bc5dac8f05632628e2aeebbbfbbad717c86698032\": container with ID starting with 2491dc05a24c13e47dd1a42bc5dac8f05632628e2aeebbbfbbad717c86698032 not found: ID does not exist" containerID="2491dc05a24c13e47dd1a42bc5dac8f05632628e2aeebbbfbbad717c86698032" Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.606115 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2491dc05a24c13e47dd1a42bc5dac8f05632628e2aeebbbfbbad717c86698032"} err="failed to get container status \"2491dc05a24c13e47dd1a42bc5dac8f05632628e2aeebbbfbbad717c86698032\": rpc error: code = NotFound desc = could not find container \"2491dc05a24c13e47dd1a42bc5dac8f05632628e2aeebbbfbbad717c86698032\": container with ID starting with 2491dc05a24c13e47dd1a42bc5dac8f05632628e2aeebbbfbbad717c86698032 not found: ID does not exist" Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.606128 4740 scope.go:117] "RemoveContainer" containerID="e811f9979738d717fa15ff6455c745697feccb6f3ba1e1b29ff45375ba98fdde" Oct 09 11:02:57 crc kubenswrapper[4740]: E1009 
11:02:57.606390 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e811f9979738d717fa15ff6455c745697feccb6f3ba1e1b29ff45375ba98fdde\": container with ID starting with e811f9979738d717fa15ff6455c745697feccb6f3ba1e1b29ff45375ba98fdde not found: ID does not exist" containerID="e811f9979738d717fa15ff6455c745697feccb6f3ba1e1b29ff45375ba98fdde" Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.606413 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e811f9979738d717fa15ff6455c745697feccb6f3ba1e1b29ff45375ba98fdde"} err="failed to get container status \"e811f9979738d717fa15ff6455c745697feccb6f3ba1e1b29ff45375ba98fdde\": rpc error: code = NotFound desc = could not find container \"e811f9979738d717fa15ff6455c745697feccb6f3ba1e1b29ff45375ba98fdde\": container with ID starting with e811f9979738d717fa15ff6455c745697feccb6f3ba1e1b29ff45375ba98fdde not found: ID does not exist" Oct 09 11:02:57 crc kubenswrapper[4740]: I1009 11:02:57.768339 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99ee0c70-ffa9-4ad5-9607-68c861bd383e" path="/var/lib/kubelet/pods/99ee0c70-ffa9-4ad5-9607-68c861bd383e/volumes" Oct 09 11:03:35 crc kubenswrapper[4740]: I1009 11:03:35.407928 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 11:03:35 crc kubenswrapper[4740]: I1009 11:03:35.409036 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 09 11:04:05 crc kubenswrapper[4740]: I1009 11:04:05.407659 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 11:04:05 crc kubenswrapper[4740]: I1009 11:04:05.408807 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 11:04:35 crc kubenswrapper[4740]: I1009 11:04:35.411287 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 11:04:35 crc kubenswrapper[4740]: I1009 11:04:35.413041 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 11:04:35 crc kubenswrapper[4740]: I1009 11:04:35.413109 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 11:04:35 crc kubenswrapper[4740]: I1009 11:04:35.414009 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689"} 
pod="openshift-machine-config-operator/machine-config-daemon-kdjch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 11:04:35 crc kubenswrapper[4740]: I1009 11:04:35.414068 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" containerID="cri-o://a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689" gracePeriod=600 Oct 09 11:04:35 crc kubenswrapper[4740]: E1009 11:04:35.537953 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:04:36 crc kubenswrapper[4740]: I1009 11:04:36.467891 4740 generic.go:334] "Generic (PLEG): container finished" podID="223b849a-db98-4f56-a649-9e144189950a" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689" exitCode=0 Oct 09 11:04:36 crc kubenswrapper[4740]: I1009 11:04:36.467981 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerDied","Data":"a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689"} Oct 09 11:04:36 crc kubenswrapper[4740]: I1009 11:04:36.468296 4740 scope.go:117] "RemoveContainer" containerID="3448287b6cd68c3403bb27caa7100e27359be5b949fa1d87e08098aaeba8b363" Oct 09 11:04:36 crc kubenswrapper[4740]: I1009 11:04:36.469056 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689" Oct 
09 11:04:36 crc kubenswrapper[4740]: E1009 11:04:36.469360 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:04:50 crc kubenswrapper[4740]: I1009 11:04:50.754308 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689" Oct 09 11:04:50 crc kubenswrapper[4740]: E1009 11:04:50.755237 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:05:04 crc kubenswrapper[4740]: I1009 11:05:04.753978 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689" Oct 09 11:05:04 crc kubenswrapper[4740]: E1009 11:05:04.754778 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:05:16 crc kubenswrapper[4740]: I1009 11:05:16.754471 4740 scope.go:117] "RemoveContainer" 
containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689" Oct 09 11:05:16 crc kubenswrapper[4740]: E1009 11:05:16.755533 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:05:24 crc kubenswrapper[4740]: I1009 11:05:24.520495 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4xn6t"] Oct 09 11:05:24 crc kubenswrapper[4740]: E1009 11:05:24.521391 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ee0c70-ffa9-4ad5-9607-68c861bd383e" containerName="extract-content" Oct 09 11:05:24 crc kubenswrapper[4740]: I1009 11:05:24.521403 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ee0c70-ffa9-4ad5-9607-68c861bd383e" containerName="extract-content" Oct 09 11:05:24 crc kubenswrapper[4740]: E1009 11:05:24.521426 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ee0c70-ffa9-4ad5-9607-68c861bd383e" containerName="registry-server" Oct 09 11:05:24 crc kubenswrapper[4740]: I1009 11:05:24.521432 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ee0c70-ffa9-4ad5-9607-68c861bd383e" containerName="registry-server" Oct 09 11:05:24 crc kubenswrapper[4740]: E1009 11:05:24.521465 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ee0c70-ffa9-4ad5-9607-68c861bd383e" containerName="extract-utilities" Oct 09 11:05:24 crc kubenswrapper[4740]: I1009 11:05:24.521473 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ee0c70-ffa9-4ad5-9607-68c861bd383e" containerName="extract-utilities" Oct 09 11:05:24 crc kubenswrapper[4740]: I1009 11:05:24.521637 
4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ee0c70-ffa9-4ad5-9607-68c861bd383e" containerName="registry-server" Oct 09 11:05:24 crc kubenswrapper[4740]: I1009 11:05:24.522907 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4xn6t" Oct 09 11:05:24 crc kubenswrapper[4740]: I1009 11:05:24.535430 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4xn6t"] Oct 09 11:05:24 crc kubenswrapper[4740]: I1009 11:05:24.651411 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48e1bc9c-ece7-4736-a224-ac557fcced5e-utilities\") pod \"redhat-operators-4xn6t\" (UID: \"48e1bc9c-ece7-4736-a224-ac557fcced5e\") " pod="openshift-marketplace/redhat-operators-4xn6t" Oct 09 11:05:24 crc kubenswrapper[4740]: I1009 11:05:24.651473 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjlml\" (UniqueName: \"kubernetes.io/projected/48e1bc9c-ece7-4736-a224-ac557fcced5e-kube-api-access-vjlml\") pod \"redhat-operators-4xn6t\" (UID: \"48e1bc9c-ece7-4736-a224-ac557fcced5e\") " pod="openshift-marketplace/redhat-operators-4xn6t" Oct 09 11:05:24 crc kubenswrapper[4740]: I1009 11:05:24.651818 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48e1bc9c-ece7-4736-a224-ac557fcced5e-catalog-content\") pod \"redhat-operators-4xn6t\" (UID: \"48e1bc9c-ece7-4736-a224-ac557fcced5e\") " pod="openshift-marketplace/redhat-operators-4xn6t" Oct 09 11:05:24 crc kubenswrapper[4740]: I1009 11:05:24.753367 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48e1bc9c-ece7-4736-a224-ac557fcced5e-utilities\") pod 
\"redhat-operators-4xn6t\" (UID: \"48e1bc9c-ece7-4736-a224-ac557fcced5e\") " pod="openshift-marketplace/redhat-operators-4xn6t" Oct 09 11:05:24 crc kubenswrapper[4740]: I1009 11:05:24.753659 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjlml\" (UniqueName: \"kubernetes.io/projected/48e1bc9c-ece7-4736-a224-ac557fcced5e-kube-api-access-vjlml\") pod \"redhat-operators-4xn6t\" (UID: \"48e1bc9c-ece7-4736-a224-ac557fcced5e\") " pod="openshift-marketplace/redhat-operators-4xn6t" Oct 09 11:05:24 crc kubenswrapper[4740]: I1009 11:05:24.753903 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48e1bc9c-ece7-4736-a224-ac557fcced5e-catalog-content\") pod \"redhat-operators-4xn6t\" (UID: \"48e1bc9c-ece7-4736-a224-ac557fcced5e\") " pod="openshift-marketplace/redhat-operators-4xn6t" Oct 09 11:05:24 crc kubenswrapper[4740]: I1009 11:05:24.753965 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48e1bc9c-ece7-4736-a224-ac557fcced5e-utilities\") pod \"redhat-operators-4xn6t\" (UID: \"48e1bc9c-ece7-4736-a224-ac557fcced5e\") " pod="openshift-marketplace/redhat-operators-4xn6t" Oct 09 11:05:24 crc kubenswrapper[4740]: I1009 11:05:24.754299 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48e1bc9c-ece7-4736-a224-ac557fcced5e-catalog-content\") pod \"redhat-operators-4xn6t\" (UID: \"48e1bc9c-ece7-4736-a224-ac557fcced5e\") " pod="openshift-marketplace/redhat-operators-4xn6t" Oct 09 11:05:24 crc kubenswrapper[4740]: I1009 11:05:24.774810 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjlml\" (UniqueName: \"kubernetes.io/projected/48e1bc9c-ece7-4736-a224-ac557fcced5e-kube-api-access-vjlml\") pod \"redhat-operators-4xn6t\" (UID: 
\"48e1bc9c-ece7-4736-a224-ac557fcced5e\") " pod="openshift-marketplace/redhat-operators-4xn6t" Oct 09 11:05:24 crc kubenswrapper[4740]: I1009 11:05:24.887358 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4xn6t" Oct 09 11:05:25 crc kubenswrapper[4740]: I1009 11:05:25.367279 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4xn6t"] Oct 09 11:05:25 crc kubenswrapper[4740]: I1009 11:05:25.923881 4740 generic.go:334] "Generic (PLEG): container finished" podID="48e1bc9c-ece7-4736-a224-ac557fcced5e" containerID="7ec1af3aa15192cda9884a91fbdc61e80013380f6518063a71916997a4058e96" exitCode=0 Oct 09 11:05:25 crc kubenswrapper[4740]: I1009 11:05:25.923965 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xn6t" event={"ID":"48e1bc9c-ece7-4736-a224-ac557fcced5e","Type":"ContainerDied","Data":"7ec1af3aa15192cda9884a91fbdc61e80013380f6518063a71916997a4058e96"} Oct 09 11:05:25 crc kubenswrapper[4740]: I1009 11:05:25.924160 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xn6t" event={"ID":"48e1bc9c-ece7-4736-a224-ac557fcced5e","Type":"ContainerStarted","Data":"c07e7127e4158721757cac4db772cf478748dad1003f603aeb2039fd06b46473"} Oct 09 11:05:25 crc kubenswrapper[4740]: I1009 11:05:25.925924 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 11:05:26 crc kubenswrapper[4740]: I1009 11:05:26.920572 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8qhn2"] Oct 09 11:05:26 crc kubenswrapper[4740]: I1009 11:05:26.922959 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8qhn2" Oct 09 11:05:26 crc kubenswrapper[4740]: I1009 11:05:26.936810 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8qhn2"] Oct 09 11:05:26 crc kubenswrapper[4740]: I1009 11:05:26.950559 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xn6t" event={"ID":"48e1bc9c-ece7-4736-a224-ac557fcced5e","Type":"ContainerStarted","Data":"8dcb833c07c5c53b64ecbee26b5cd539a800429df35c9e58415cd6805bdd5e78"} Oct 09 11:05:27 crc kubenswrapper[4740]: I1009 11:05:27.002992 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r2vh\" (UniqueName: \"kubernetes.io/projected/352e94f5-8f4f-4868-b6c2-ec06e131e49d-kube-api-access-7r2vh\") pod \"community-operators-8qhn2\" (UID: \"352e94f5-8f4f-4868-b6c2-ec06e131e49d\") " pod="openshift-marketplace/community-operators-8qhn2" Oct 09 11:05:27 crc kubenswrapper[4740]: I1009 11:05:27.003094 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/352e94f5-8f4f-4868-b6c2-ec06e131e49d-utilities\") pod \"community-operators-8qhn2\" (UID: \"352e94f5-8f4f-4868-b6c2-ec06e131e49d\") " pod="openshift-marketplace/community-operators-8qhn2" Oct 09 11:05:27 crc kubenswrapper[4740]: I1009 11:05:27.003119 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/352e94f5-8f4f-4868-b6c2-ec06e131e49d-catalog-content\") pod \"community-operators-8qhn2\" (UID: \"352e94f5-8f4f-4868-b6c2-ec06e131e49d\") " pod="openshift-marketplace/community-operators-8qhn2" Oct 09 11:05:27 crc kubenswrapper[4740]: I1009 11:05:27.105009 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/352e94f5-8f4f-4868-b6c2-ec06e131e49d-utilities\") pod \"community-operators-8qhn2\" (UID: \"352e94f5-8f4f-4868-b6c2-ec06e131e49d\") " pod="openshift-marketplace/community-operators-8qhn2" Oct 09 11:05:27 crc kubenswrapper[4740]: I1009 11:05:27.105072 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/352e94f5-8f4f-4868-b6c2-ec06e131e49d-catalog-content\") pod \"community-operators-8qhn2\" (UID: \"352e94f5-8f4f-4868-b6c2-ec06e131e49d\") " pod="openshift-marketplace/community-operators-8qhn2" Oct 09 11:05:27 crc kubenswrapper[4740]: I1009 11:05:27.105693 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/352e94f5-8f4f-4868-b6c2-ec06e131e49d-utilities\") pod \"community-operators-8qhn2\" (UID: \"352e94f5-8f4f-4868-b6c2-ec06e131e49d\") " pod="openshift-marketplace/community-operators-8qhn2" Oct 09 11:05:27 crc kubenswrapper[4740]: I1009 11:05:27.105738 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r2vh\" (UniqueName: \"kubernetes.io/projected/352e94f5-8f4f-4868-b6c2-ec06e131e49d-kube-api-access-7r2vh\") pod \"community-operators-8qhn2\" (UID: \"352e94f5-8f4f-4868-b6c2-ec06e131e49d\") " pod="openshift-marketplace/community-operators-8qhn2" Oct 09 11:05:27 crc kubenswrapper[4740]: I1009 11:05:27.105554 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/352e94f5-8f4f-4868-b6c2-ec06e131e49d-catalog-content\") pod \"community-operators-8qhn2\" (UID: \"352e94f5-8f4f-4868-b6c2-ec06e131e49d\") " pod="openshift-marketplace/community-operators-8qhn2" Oct 09 11:05:27 crc kubenswrapper[4740]: I1009 11:05:27.134557 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r2vh\" (UniqueName: 
\"kubernetes.io/projected/352e94f5-8f4f-4868-b6c2-ec06e131e49d-kube-api-access-7r2vh\") pod \"community-operators-8qhn2\" (UID: \"352e94f5-8f4f-4868-b6c2-ec06e131e49d\") " pod="openshift-marketplace/community-operators-8qhn2" Oct 09 11:05:27 crc kubenswrapper[4740]: I1009 11:05:27.259349 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8qhn2" Oct 09 11:05:27 crc kubenswrapper[4740]: I1009 11:05:27.808865 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8qhn2"] Oct 09 11:05:27 crc kubenswrapper[4740]: I1009 11:05:27.966205 4740 generic.go:334] "Generic (PLEG): container finished" podID="48e1bc9c-ece7-4736-a224-ac557fcced5e" containerID="8dcb833c07c5c53b64ecbee26b5cd539a800429df35c9e58415cd6805bdd5e78" exitCode=0 Oct 09 11:05:27 crc kubenswrapper[4740]: I1009 11:05:27.966463 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xn6t" event={"ID":"48e1bc9c-ece7-4736-a224-ac557fcced5e","Type":"ContainerDied","Data":"8dcb833c07c5c53b64ecbee26b5cd539a800429df35c9e58415cd6805bdd5e78"} Oct 09 11:05:27 crc kubenswrapper[4740]: I1009 11:05:27.970825 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qhn2" event={"ID":"352e94f5-8f4f-4868-b6c2-ec06e131e49d","Type":"ContainerStarted","Data":"abc9c9b795003ded5229f66630f96aab6bb42c3e285220022b39b219cc87f8c8"} Oct 09 11:05:28 crc kubenswrapper[4740]: I1009 11:05:28.981279 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xn6t" event={"ID":"48e1bc9c-ece7-4736-a224-ac557fcced5e","Type":"ContainerStarted","Data":"15c4df7cfdc78038c4526068cfdb52660ef9327462c64ce26a758a9cf63c8590"} Oct 09 11:05:28 crc kubenswrapper[4740]: I1009 11:05:28.984349 4740 generic.go:334] "Generic (PLEG): container finished" podID="352e94f5-8f4f-4868-b6c2-ec06e131e49d" 
containerID="6af4a2e84298b087405992e6d60136625a99525ad6fddec7fddb523dd36349c4" exitCode=0 Oct 09 11:05:28 crc kubenswrapper[4740]: I1009 11:05:28.984509 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qhn2" event={"ID":"352e94f5-8f4f-4868-b6c2-ec06e131e49d","Type":"ContainerDied","Data":"6af4a2e84298b087405992e6d60136625a99525ad6fddec7fddb523dd36349c4"} Oct 09 11:05:29 crc kubenswrapper[4740]: I1009 11:05:29.010291 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4xn6t" podStartSLOduration=2.529501084 podStartE2EDuration="5.010270954s" podCreationTimestamp="2025-10-09 11:05:24 +0000 UTC" firstStartedPulling="2025-10-09 11:05:25.925654458 +0000 UTC m=+2264.887854839" lastFinishedPulling="2025-10-09 11:05:28.406424338 +0000 UTC m=+2267.368624709" observedRunningTime="2025-10-09 11:05:28.998996378 +0000 UTC m=+2267.961196759" watchObservedRunningTime="2025-10-09 11:05:29.010270954 +0000 UTC m=+2267.972471345" Oct 09 11:05:29 crc kubenswrapper[4740]: I1009 11:05:29.754116 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689" Oct 09 11:05:29 crc kubenswrapper[4740]: E1009 11:05:29.754382 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:05:31 crc kubenswrapper[4740]: I1009 11:05:31.003240 4740 generic.go:334] "Generic (PLEG): container finished" podID="352e94f5-8f4f-4868-b6c2-ec06e131e49d" containerID="1d46d110aed393aa384c41b1aa453cef44d06ec9293310ccc6808c4a5687c969" exitCode=0 Oct 09 11:05:31 crc 
kubenswrapper[4740]: I1009 11:05:31.003284 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qhn2" event={"ID":"352e94f5-8f4f-4868-b6c2-ec06e131e49d","Type":"ContainerDied","Data":"1d46d110aed393aa384c41b1aa453cef44d06ec9293310ccc6808c4a5687c969"} Oct 09 11:05:32 crc kubenswrapper[4740]: I1009 11:05:32.012889 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qhn2" event={"ID":"352e94f5-8f4f-4868-b6c2-ec06e131e49d","Type":"ContainerStarted","Data":"052aa94641925a63b59aaa06776f1d89806377569b69bc5ac4d815dfd515f22f"} Oct 09 11:05:32 crc kubenswrapper[4740]: I1009 11:05:32.035040 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8qhn2" podStartSLOduration=3.514867232 podStartE2EDuration="6.035017453s" podCreationTimestamp="2025-10-09 11:05:26 +0000 UTC" firstStartedPulling="2025-10-09 11:05:28.986000314 +0000 UTC m=+2267.948200695" lastFinishedPulling="2025-10-09 11:05:31.506150535 +0000 UTC m=+2270.468350916" observedRunningTime="2025-10-09 11:05:32.028630989 +0000 UTC m=+2270.990831370" watchObservedRunningTime="2025-10-09 11:05:32.035017453 +0000 UTC m=+2270.997217834" Oct 09 11:05:34 crc kubenswrapper[4740]: I1009 11:05:34.888063 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4xn6t" Oct 09 11:05:34 crc kubenswrapper[4740]: I1009 11:05:34.889180 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4xn6t" Oct 09 11:05:34 crc kubenswrapper[4740]: I1009 11:05:34.935351 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4xn6t" Oct 09 11:05:35 crc kubenswrapper[4740]: I1009 11:05:35.100795 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-4xn6t" Oct 09 11:05:35 crc kubenswrapper[4740]: I1009 11:05:35.901369 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4xn6t"] Oct 09 11:05:37 crc kubenswrapper[4740]: I1009 11:05:37.059741 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4xn6t" podUID="48e1bc9c-ece7-4736-a224-ac557fcced5e" containerName="registry-server" containerID="cri-o://15c4df7cfdc78038c4526068cfdb52660ef9327462c64ce26a758a9cf63c8590" gracePeriod=2 Oct 09 11:05:37 crc kubenswrapper[4740]: I1009 11:05:37.260523 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8qhn2" Oct 09 11:05:37 crc kubenswrapper[4740]: I1009 11:05:37.260590 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8qhn2" Oct 09 11:05:37 crc kubenswrapper[4740]: I1009 11:05:37.338676 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8qhn2" Oct 09 11:05:37 crc kubenswrapper[4740]: I1009 11:05:37.556638 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4xn6t" Oct 09 11:05:37 crc kubenswrapper[4740]: I1009 11:05:37.627300 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48e1bc9c-ece7-4736-a224-ac557fcced5e-catalog-content\") pod \"48e1bc9c-ece7-4736-a224-ac557fcced5e\" (UID: \"48e1bc9c-ece7-4736-a224-ac557fcced5e\") " Oct 09 11:05:37 crc kubenswrapper[4740]: I1009 11:05:37.627561 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48e1bc9c-ece7-4736-a224-ac557fcced5e-utilities\") pod \"48e1bc9c-ece7-4736-a224-ac557fcced5e\" (UID: \"48e1bc9c-ece7-4736-a224-ac557fcced5e\") " Oct 09 11:05:37 crc kubenswrapper[4740]: I1009 11:05:37.627655 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjlml\" (UniqueName: \"kubernetes.io/projected/48e1bc9c-ece7-4736-a224-ac557fcced5e-kube-api-access-vjlml\") pod \"48e1bc9c-ece7-4736-a224-ac557fcced5e\" (UID: \"48e1bc9c-ece7-4736-a224-ac557fcced5e\") " Oct 09 11:05:37 crc kubenswrapper[4740]: I1009 11:05:37.628583 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48e1bc9c-ece7-4736-a224-ac557fcced5e-utilities" (OuterVolumeSpecName: "utilities") pod "48e1bc9c-ece7-4736-a224-ac557fcced5e" (UID: "48e1bc9c-ece7-4736-a224-ac557fcced5e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:05:37 crc kubenswrapper[4740]: I1009 11:05:37.633462 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48e1bc9c-ece7-4736-a224-ac557fcced5e-kube-api-access-vjlml" (OuterVolumeSpecName: "kube-api-access-vjlml") pod "48e1bc9c-ece7-4736-a224-ac557fcced5e" (UID: "48e1bc9c-ece7-4736-a224-ac557fcced5e"). InnerVolumeSpecName "kube-api-access-vjlml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:05:37 crc kubenswrapper[4740]: I1009 11:05:37.721840 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48e1bc9c-ece7-4736-a224-ac557fcced5e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48e1bc9c-ece7-4736-a224-ac557fcced5e" (UID: "48e1bc9c-ece7-4736-a224-ac557fcced5e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:05:37 crc kubenswrapper[4740]: I1009 11:05:37.729818 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48e1bc9c-ece7-4736-a224-ac557fcced5e-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 11:05:37 crc kubenswrapper[4740]: I1009 11:05:37.729863 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjlml\" (UniqueName: \"kubernetes.io/projected/48e1bc9c-ece7-4736-a224-ac557fcced5e-kube-api-access-vjlml\") on node \"crc\" DevicePath \"\"" Oct 09 11:05:37 crc kubenswrapper[4740]: I1009 11:05:37.729875 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48e1bc9c-ece7-4736-a224-ac557fcced5e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 11:05:37 crc kubenswrapper[4740]: E1009 11:05:37.951091 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48e1bc9c_ece7_4736_a224_ac557fcced5e.slice/crio-c07e7127e4158721757cac4db772cf478748dad1003f603aeb2039fd06b46473\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48e1bc9c_ece7_4736_a224_ac557fcced5e.slice\": RecentStats: unable to find data in memory cache]" Oct 09 11:05:38 crc kubenswrapper[4740]: I1009 11:05:38.077145 4740 generic.go:334] "Generic (PLEG): container 
finished" podID="48e1bc9c-ece7-4736-a224-ac557fcced5e" containerID="15c4df7cfdc78038c4526068cfdb52660ef9327462c64ce26a758a9cf63c8590" exitCode=0 Oct 09 11:05:38 crc kubenswrapper[4740]: I1009 11:05:38.078164 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4xn6t" Oct 09 11:05:38 crc kubenswrapper[4740]: I1009 11:05:38.078624 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xn6t" event={"ID":"48e1bc9c-ece7-4736-a224-ac557fcced5e","Type":"ContainerDied","Data":"15c4df7cfdc78038c4526068cfdb52660ef9327462c64ce26a758a9cf63c8590"} Oct 09 11:05:38 crc kubenswrapper[4740]: I1009 11:05:38.078993 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xn6t" event={"ID":"48e1bc9c-ece7-4736-a224-ac557fcced5e","Type":"ContainerDied","Data":"c07e7127e4158721757cac4db772cf478748dad1003f603aeb2039fd06b46473"} Oct 09 11:05:38 crc kubenswrapper[4740]: I1009 11:05:38.079048 4740 scope.go:117] "RemoveContainer" containerID="15c4df7cfdc78038c4526068cfdb52660ef9327462c64ce26a758a9cf63c8590" Oct 09 11:05:38 crc kubenswrapper[4740]: I1009 11:05:38.113076 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4xn6t"] Oct 09 11:05:38 crc kubenswrapper[4740]: I1009 11:05:38.119108 4740 scope.go:117] "RemoveContainer" containerID="8dcb833c07c5c53b64ecbee26b5cd539a800429df35c9e58415cd6805bdd5e78" Oct 09 11:05:38 crc kubenswrapper[4740]: I1009 11:05:38.124154 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4xn6t"] Oct 09 11:05:38 crc kubenswrapper[4740]: I1009 11:05:38.133082 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8qhn2" Oct 09 11:05:38 crc kubenswrapper[4740]: I1009 11:05:38.138998 4740 scope.go:117] "RemoveContainer" 
containerID="7ec1af3aa15192cda9884a91fbdc61e80013380f6518063a71916997a4058e96" Oct 09 11:05:38 crc kubenswrapper[4740]: I1009 11:05:38.186732 4740 scope.go:117] "RemoveContainer" containerID="15c4df7cfdc78038c4526068cfdb52660ef9327462c64ce26a758a9cf63c8590" Oct 09 11:05:38 crc kubenswrapper[4740]: E1009 11:05:38.187112 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15c4df7cfdc78038c4526068cfdb52660ef9327462c64ce26a758a9cf63c8590\": container with ID starting with 15c4df7cfdc78038c4526068cfdb52660ef9327462c64ce26a758a9cf63c8590 not found: ID does not exist" containerID="15c4df7cfdc78038c4526068cfdb52660ef9327462c64ce26a758a9cf63c8590" Oct 09 11:05:38 crc kubenswrapper[4740]: I1009 11:05:38.187191 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15c4df7cfdc78038c4526068cfdb52660ef9327462c64ce26a758a9cf63c8590"} err="failed to get container status \"15c4df7cfdc78038c4526068cfdb52660ef9327462c64ce26a758a9cf63c8590\": rpc error: code = NotFound desc = could not find container \"15c4df7cfdc78038c4526068cfdb52660ef9327462c64ce26a758a9cf63c8590\": container with ID starting with 15c4df7cfdc78038c4526068cfdb52660ef9327462c64ce26a758a9cf63c8590 not found: ID does not exist" Oct 09 11:05:38 crc kubenswrapper[4740]: I1009 11:05:38.187218 4740 scope.go:117] "RemoveContainer" containerID="8dcb833c07c5c53b64ecbee26b5cd539a800429df35c9e58415cd6805bdd5e78" Oct 09 11:05:38 crc kubenswrapper[4740]: E1009 11:05:38.187692 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dcb833c07c5c53b64ecbee26b5cd539a800429df35c9e58415cd6805bdd5e78\": container with ID starting with 8dcb833c07c5c53b64ecbee26b5cd539a800429df35c9e58415cd6805bdd5e78 not found: ID does not exist" containerID="8dcb833c07c5c53b64ecbee26b5cd539a800429df35c9e58415cd6805bdd5e78" Oct 09 11:05:38 crc 
kubenswrapper[4740]: I1009 11:05:38.187724 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dcb833c07c5c53b64ecbee26b5cd539a800429df35c9e58415cd6805bdd5e78"} err="failed to get container status \"8dcb833c07c5c53b64ecbee26b5cd539a800429df35c9e58415cd6805bdd5e78\": rpc error: code = NotFound desc = could not find container \"8dcb833c07c5c53b64ecbee26b5cd539a800429df35c9e58415cd6805bdd5e78\": container with ID starting with 8dcb833c07c5c53b64ecbee26b5cd539a800429df35c9e58415cd6805bdd5e78 not found: ID does not exist" Oct 09 11:05:38 crc kubenswrapper[4740]: I1009 11:05:38.187768 4740 scope.go:117] "RemoveContainer" containerID="7ec1af3aa15192cda9884a91fbdc61e80013380f6518063a71916997a4058e96" Oct 09 11:05:38 crc kubenswrapper[4740]: E1009 11:05:38.188014 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ec1af3aa15192cda9884a91fbdc61e80013380f6518063a71916997a4058e96\": container with ID starting with 7ec1af3aa15192cda9884a91fbdc61e80013380f6518063a71916997a4058e96 not found: ID does not exist" containerID="7ec1af3aa15192cda9884a91fbdc61e80013380f6518063a71916997a4058e96" Oct 09 11:05:38 crc kubenswrapper[4740]: I1009 11:05:38.188037 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ec1af3aa15192cda9884a91fbdc61e80013380f6518063a71916997a4058e96"} err="failed to get container status \"7ec1af3aa15192cda9884a91fbdc61e80013380f6518063a71916997a4058e96\": rpc error: code = NotFound desc = could not find container \"7ec1af3aa15192cda9884a91fbdc61e80013380f6518063a71916997a4058e96\": container with ID starting with 7ec1af3aa15192cda9884a91fbdc61e80013380f6518063a71916997a4058e96 not found: ID does not exist" Oct 09 11:05:38 crc kubenswrapper[4740]: I1009 11:05:38.904025 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8qhn2"] Oct 09 11:05:39 
crc kubenswrapper[4740]: I1009 11:05:39.765136 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48e1bc9c-ece7-4736-a224-ac557fcced5e" path="/var/lib/kubelet/pods/48e1bc9c-ece7-4736-a224-ac557fcced5e/volumes" Oct 09 11:05:40 crc kubenswrapper[4740]: I1009 11:05:40.098812 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8qhn2" podUID="352e94f5-8f4f-4868-b6c2-ec06e131e49d" containerName="registry-server" containerID="cri-o://052aa94641925a63b59aaa06776f1d89806377569b69bc5ac4d815dfd515f22f" gracePeriod=2 Oct 09 11:05:40 crc kubenswrapper[4740]: I1009 11:05:40.532731 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8qhn2" Oct 09 11:05:40 crc kubenswrapper[4740]: I1009 11:05:40.592046 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/352e94f5-8f4f-4868-b6c2-ec06e131e49d-catalog-content\") pod \"352e94f5-8f4f-4868-b6c2-ec06e131e49d\" (UID: \"352e94f5-8f4f-4868-b6c2-ec06e131e49d\") " Oct 09 11:05:40 crc kubenswrapper[4740]: I1009 11:05:40.592144 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r2vh\" (UniqueName: \"kubernetes.io/projected/352e94f5-8f4f-4868-b6c2-ec06e131e49d-kube-api-access-7r2vh\") pod \"352e94f5-8f4f-4868-b6c2-ec06e131e49d\" (UID: \"352e94f5-8f4f-4868-b6c2-ec06e131e49d\") " Oct 09 11:05:40 crc kubenswrapper[4740]: I1009 11:05:40.592257 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/352e94f5-8f4f-4868-b6c2-ec06e131e49d-utilities\") pod \"352e94f5-8f4f-4868-b6c2-ec06e131e49d\" (UID: \"352e94f5-8f4f-4868-b6c2-ec06e131e49d\") " Oct 09 11:05:40 crc kubenswrapper[4740]: I1009 11:05:40.593606 4740 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/352e94f5-8f4f-4868-b6c2-ec06e131e49d-utilities" (OuterVolumeSpecName: "utilities") pod "352e94f5-8f4f-4868-b6c2-ec06e131e49d" (UID: "352e94f5-8f4f-4868-b6c2-ec06e131e49d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 11:05:40 crc kubenswrapper[4740]: I1009 11:05:40.599048 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/352e94f5-8f4f-4868-b6c2-ec06e131e49d-kube-api-access-7r2vh" (OuterVolumeSpecName: "kube-api-access-7r2vh") pod "352e94f5-8f4f-4868-b6c2-ec06e131e49d" (UID: "352e94f5-8f4f-4868-b6c2-ec06e131e49d"). InnerVolumeSpecName "kube-api-access-7r2vh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 11:05:40 crc kubenswrapper[4740]: I1009 11:05:40.655803 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/352e94f5-8f4f-4868-b6c2-ec06e131e49d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "352e94f5-8f4f-4868-b6c2-ec06e131e49d" (UID: "352e94f5-8f4f-4868-b6c2-ec06e131e49d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 11:05:40 crc kubenswrapper[4740]: I1009 11:05:40.694609 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/352e94f5-8f4f-4868-b6c2-ec06e131e49d-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 09 11:05:40 crc kubenswrapper[4740]: I1009 11:05:40.694649 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r2vh\" (UniqueName: \"kubernetes.io/projected/352e94f5-8f4f-4868-b6c2-ec06e131e49d-kube-api-access-7r2vh\") on node \"crc\" DevicePath \"\""
Oct 09 11:05:40 crc kubenswrapper[4740]: I1009 11:05:40.694664 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/352e94f5-8f4f-4868-b6c2-ec06e131e49d-utilities\") on node \"crc\" DevicePath \"\""
Oct 09 11:05:41 crc kubenswrapper[4740]: I1009 11:05:41.112271 4740 generic.go:334] "Generic (PLEG): container finished" podID="352e94f5-8f4f-4868-b6c2-ec06e131e49d" containerID="052aa94641925a63b59aaa06776f1d89806377569b69bc5ac4d815dfd515f22f" exitCode=0
Oct 09 11:05:41 crc kubenswrapper[4740]: I1009 11:05:41.112318 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qhn2" event={"ID":"352e94f5-8f4f-4868-b6c2-ec06e131e49d","Type":"ContainerDied","Data":"052aa94641925a63b59aaa06776f1d89806377569b69bc5ac4d815dfd515f22f"}
Oct 09 11:05:41 crc kubenswrapper[4740]: I1009 11:05:41.112357 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qhn2" event={"ID":"352e94f5-8f4f-4868-b6c2-ec06e131e49d","Type":"ContainerDied","Data":"abc9c9b795003ded5229f66630f96aab6bb42c3e285220022b39b219cc87f8c8"}
Oct 09 11:05:41 crc kubenswrapper[4740]: I1009 11:05:41.112368 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8qhn2"
Oct 09 11:05:41 crc kubenswrapper[4740]: I1009 11:05:41.112375 4740 scope.go:117] "RemoveContainer" containerID="052aa94641925a63b59aaa06776f1d89806377569b69bc5ac4d815dfd515f22f"
Oct 09 11:05:41 crc kubenswrapper[4740]: I1009 11:05:41.130702 4740 scope.go:117] "RemoveContainer" containerID="1d46d110aed393aa384c41b1aa453cef44d06ec9293310ccc6808c4a5687c969"
Oct 09 11:05:41 crc kubenswrapper[4740]: I1009 11:05:41.158664 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8qhn2"]
Oct 09 11:05:41 crc kubenswrapper[4740]: I1009 11:05:41.163002 4740 scope.go:117] "RemoveContainer" containerID="6af4a2e84298b087405992e6d60136625a99525ad6fddec7fddb523dd36349c4"
Oct 09 11:05:41 crc kubenswrapper[4740]: I1009 11:05:41.167302 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8qhn2"]
Oct 09 11:05:41 crc kubenswrapper[4740]: I1009 11:05:41.213857 4740 scope.go:117] "RemoveContainer" containerID="052aa94641925a63b59aaa06776f1d89806377569b69bc5ac4d815dfd515f22f"
Oct 09 11:05:41 crc kubenswrapper[4740]: E1009 11:05:41.214383 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"052aa94641925a63b59aaa06776f1d89806377569b69bc5ac4d815dfd515f22f\": container with ID starting with 052aa94641925a63b59aaa06776f1d89806377569b69bc5ac4d815dfd515f22f not found: ID does not exist" containerID="052aa94641925a63b59aaa06776f1d89806377569b69bc5ac4d815dfd515f22f"
Oct 09 11:05:41 crc kubenswrapper[4740]: I1009 11:05:41.214418 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"052aa94641925a63b59aaa06776f1d89806377569b69bc5ac4d815dfd515f22f"} err="failed to get container status \"052aa94641925a63b59aaa06776f1d89806377569b69bc5ac4d815dfd515f22f\": rpc error: code = NotFound desc = could not find container \"052aa94641925a63b59aaa06776f1d89806377569b69bc5ac4d815dfd515f22f\": container with ID starting with 052aa94641925a63b59aaa06776f1d89806377569b69bc5ac4d815dfd515f22f not found: ID does not exist"
Oct 09 11:05:41 crc kubenswrapper[4740]: I1009 11:05:41.214469 4740 scope.go:117] "RemoveContainer" containerID="1d46d110aed393aa384c41b1aa453cef44d06ec9293310ccc6808c4a5687c969"
Oct 09 11:05:41 crc kubenswrapper[4740]: E1009 11:05:41.214742 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d46d110aed393aa384c41b1aa453cef44d06ec9293310ccc6808c4a5687c969\": container with ID starting with 1d46d110aed393aa384c41b1aa453cef44d06ec9293310ccc6808c4a5687c969 not found: ID does not exist" containerID="1d46d110aed393aa384c41b1aa453cef44d06ec9293310ccc6808c4a5687c969"
Oct 09 11:05:41 crc kubenswrapper[4740]: I1009 11:05:41.214774 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d46d110aed393aa384c41b1aa453cef44d06ec9293310ccc6808c4a5687c969"} err="failed to get container status \"1d46d110aed393aa384c41b1aa453cef44d06ec9293310ccc6808c4a5687c969\": rpc error: code = NotFound desc = could not find container \"1d46d110aed393aa384c41b1aa453cef44d06ec9293310ccc6808c4a5687c969\": container with ID starting with 1d46d110aed393aa384c41b1aa453cef44d06ec9293310ccc6808c4a5687c969 not found: ID does not exist"
Oct 09 11:05:41 crc kubenswrapper[4740]: I1009 11:05:41.214785 4740 scope.go:117] "RemoveContainer" containerID="6af4a2e84298b087405992e6d60136625a99525ad6fddec7fddb523dd36349c4"
Oct 09 11:05:41 crc kubenswrapper[4740]: E1009 11:05:41.215033 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6af4a2e84298b087405992e6d60136625a99525ad6fddec7fddb523dd36349c4\": container with ID starting with 6af4a2e84298b087405992e6d60136625a99525ad6fddec7fddb523dd36349c4 not found: ID does not exist" containerID="6af4a2e84298b087405992e6d60136625a99525ad6fddec7fddb523dd36349c4"
Oct 09 11:05:41 crc kubenswrapper[4740]: I1009 11:05:41.215058 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6af4a2e84298b087405992e6d60136625a99525ad6fddec7fddb523dd36349c4"} err="failed to get container status \"6af4a2e84298b087405992e6d60136625a99525ad6fddec7fddb523dd36349c4\": rpc error: code = NotFound desc = could not find container \"6af4a2e84298b087405992e6d60136625a99525ad6fddec7fddb523dd36349c4\": container with ID starting with 6af4a2e84298b087405992e6d60136625a99525ad6fddec7fddb523dd36349c4 not found: ID does not exist"
Oct 09 11:05:41 crc kubenswrapper[4740]: I1009 11:05:41.768935 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="352e94f5-8f4f-4868-b6c2-ec06e131e49d" path="/var/lib/kubelet/pods/352e94f5-8f4f-4868-b6c2-ec06e131e49d/volumes"
Oct 09 11:05:42 crc kubenswrapper[4740]: I1009 11:05:42.753738 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689"
Oct 09 11:05:42 crc kubenswrapper[4740]: E1009 11:05:42.754035 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a"
Oct 09 11:05:56 crc kubenswrapper[4740]: I1009 11:05:56.754268 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689"
Oct 09 11:05:56 crc kubenswrapper[4740]: E1009 11:05:56.754971 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a"
Oct 09 11:06:10 crc kubenswrapper[4740]: I1009 11:06:10.753714 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689"
Oct 09 11:06:10 crc kubenswrapper[4740]: E1009 11:06:10.754535 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a"
Oct 09 11:06:25 crc kubenswrapper[4740]: I1009 11:06:25.753887 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689"
Oct 09 11:06:25 crc kubenswrapper[4740]: E1009 11:06:25.754860 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a"
Oct 09 11:06:39 crc kubenswrapper[4740]: I1009 11:06:39.754288 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689"
Oct 09 11:06:39 crc kubenswrapper[4740]: E1009 11:06:39.755109 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a"
Oct 09 11:06:51 crc kubenswrapper[4740]: I1009 11:06:51.762265 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689"
Oct 09 11:06:51 crc kubenswrapper[4740]: E1009 11:06:51.763279 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a"
Oct 09 11:07:02 crc kubenswrapper[4740]: I1009 11:07:02.753853 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689"
Oct 09 11:07:02 crc kubenswrapper[4740]: E1009 11:07:02.754577 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a"
Oct 09 11:07:09 crc kubenswrapper[4740]: I1009 11:07:09.933947 4740 generic.go:334] "Generic (PLEG): container finished" podID="55748bea-018d-4297-8939-ffec480b42ba" containerID="21cd9d6c898f88c84e4f099ebfbf139bd23eae8612400e3c4a6be9bc8f2f7457" exitCode=0
Oct 09 11:07:09 crc kubenswrapper[4740]: I1009 11:07:09.933997 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" event={"ID":"55748bea-018d-4297-8939-ffec480b42ba","Type":"ContainerDied","Data":"21cd9d6c898f88c84e4f099ebfbf139bd23eae8612400e3c4a6be9bc8f2f7457"}
Oct 09 11:07:11 crc kubenswrapper[4740]: I1009 11:07:11.385825 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72"
Oct 09 11:07:11 crc kubenswrapper[4740]: I1009 11:07:11.475176 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-libvirt-secret-0\") pod \"55748bea-018d-4297-8939-ffec480b42ba\" (UID: \"55748bea-018d-4297-8939-ffec480b42ba\") "
Oct 09 11:07:11 crc kubenswrapper[4740]: I1009 11:07:11.475233 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-inventory\") pod \"55748bea-018d-4297-8939-ffec480b42ba\" (UID: \"55748bea-018d-4297-8939-ffec480b42ba\") "
Oct 09 11:07:11 crc kubenswrapper[4740]: I1009 11:07:11.475363 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-ssh-key\") pod \"55748bea-018d-4297-8939-ffec480b42ba\" (UID: \"55748bea-018d-4297-8939-ffec480b42ba\") "
Oct 09 11:07:11 crc kubenswrapper[4740]: I1009 11:07:11.475446 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xrs9\" (UniqueName: \"kubernetes.io/projected/55748bea-018d-4297-8939-ffec480b42ba-kube-api-access-7xrs9\") pod \"55748bea-018d-4297-8939-ffec480b42ba\" (UID: \"55748bea-018d-4297-8939-ffec480b42ba\") "
Oct 09 11:07:11 crc kubenswrapper[4740]: I1009 11:07:11.475558 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-libvirt-combined-ca-bundle\") pod \"55748bea-018d-4297-8939-ffec480b42ba\" (UID: \"55748bea-018d-4297-8939-ffec480b42ba\") "
Oct 09 11:07:11 crc kubenswrapper[4740]: I1009 11:07:11.488261 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55748bea-018d-4297-8939-ffec480b42ba-kube-api-access-7xrs9" (OuterVolumeSpecName: "kube-api-access-7xrs9") pod "55748bea-018d-4297-8939-ffec480b42ba" (UID: "55748bea-018d-4297-8939-ffec480b42ba"). InnerVolumeSpecName "kube-api-access-7xrs9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 11:07:11 crc kubenswrapper[4740]: I1009 11:07:11.508938 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "55748bea-018d-4297-8939-ffec480b42ba" (UID: "55748bea-018d-4297-8939-ffec480b42ba"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 11:07:11 crc kubenswrapper[4740]: I1009 11:07:11.509461 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-inventory" (OuterVolumeSpecName: "inventory") pod "55748bea-018d-4297-8939-ffec480b42ba" (UID: "55748bea-018d-4297-8939-ffec480b42ba"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 11:07:11 crc kubenswrapper[4740]: I1009 11:07:11.509893 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "55748bea-018d-4297-8939-ffec480b42ba" (UID: "55748bea-018d-4297-8939-ffec480b42ba"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 11:07:11 crc kubenswrapper[4740]: I1009 11:07:11.516156 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "55748bea-018d-4297-8939-ffec480b42ba" (UID: "55748bea-018d-4297-8939-ffec480b42ba"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 11:07:11 crc kubenswrapper[4740]: I1009 11:07:11.577186 4740 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 11:07:11 crc kubenswrapper[4740]: I1009 11:07:11.577224 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-inventory\") on node \"crc\" DevicePath \"\""
Oct 09 11:07:11 crc kubenswrapper[4740]: I1009 11:07:11.577236 4740 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Oct 09 11:07:11 crc kubenswrapper[4740]: I1009 11:07:11.577244 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55748bea-018d-4297-8939-ffec480b42ba-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 09 11:07:11 crc kubenswrapper[4740]: I1009 11:07:11.577251 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xrs9\" (UniqueName: \"kubernetes.io/projected/55748bea-018d-4297-8939-ffec480b42ba-kube-api-access-7xrs9\") on node \"crc\" DevicePath \"\""
Oct 09 11:07:11 crc kubenswrapper[4740]: I1009 11:07:11.963596 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72" event={"ID":"55748bea-018d-4297-8939-ffec480b42ba","Type":"ContainerDied","Data":"32ec2149a21ebd4f0d59ce47e32d9d9351c3cc6fd695543b0f732aae731fe280"}
Oct 09 11:07:11 crc kubenswrapper[4740]: I1009 11:07:11.964023 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32ec2149a21ebd4f0d59ce47e32d9d9351c3cc6fd695543b0f732aae731fe280"
Oct 09 11:07:11 crc kubenswrapper[4740]: I1009 11:07:11.963696 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg72"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.050214 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"]
Oct 09 11:07:12 crc kubenswrapper[4740]: E1009 11:07:12.050569 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48e1bc9c-ece7-4736-a224-ac557fcced5e" containerName="extract-content"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.050585 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="48e1bc9c-ece7-4736-a224-ac557fcced5e" containerName="extract-content"
Oct 09 11:07:12 crc kubenswrapper[4740]: E1009 11:07:12.050606 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="352e94f5-8f4f-4868-b6c2-ec06e131e49d" containerName="registry-server"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.050612 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="352e94f5-8f4f-4868-b6c2-ec06e131e49d" containerName="registry-server"
Oct 09 11:07:12 crc kubenswrapper[4740]: E1009 11:07:12.050625 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="352e94f5-8f4f-4868-b6c2-ec06e131e49d" containerName="extract-content"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.050631 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="352e94f5-8f4f-4868-b6c2-ec06e131e49d" containerName="extract-content"
Oct 09 11:07:12 crc kubenswrapper[4740]: E1009 11:07:12.050645 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48e1bc9c-ece7-4736-a224-ac557fcced5e" containerName="registry-server"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.050650 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="48e1bc9c-ece7-4736-a224-ac557fcced5e" containerName="registry-server"
Oct 09 11:07:12 crc kubenswrapper[4740]: E1009 11:07:12.050665 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="352e94f5-8f4f-4868-b6c2-ec06e131e49d" containerName="extract-utilities"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.050674 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="352e94f5-8f4f-4868-b6c2-ec06e131e49d" containerName="extract-utilities"
Oct 09 11:07:12 crc kubenswrapper[4740]: E1009 11:07:12.050693 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55748bea-018d-4297-8939-ffec480b42ba" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.050699 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="55748bea-018d-4297-8939-ffec480b42ba" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Oct 09 11:07:12 crc kubenswrapper[4740]: E1009 11:07:12.050705 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48e1bc9c-ece7-4736-a224-ac557fcced5e" containerName="extract-utilities"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.050711 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="48e1bc9c-ece7-4736-a224-ac557fcced5e" containerName="extract-utilities"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.050884 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="55748bea-018d-4297-8939-ffec480b42ba" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.050896 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="48e1bc9c-ece7-4736-a224-ac557fcced5e" containerName="registry-server"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.050908 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="352e94f5-8f4f-4868-b6c2-ec06e131e49d" containerName="registry-server"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.051503 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.054420 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hslsm"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.055119 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.055288 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.055437 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.055729 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.055912 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.064143 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"]
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.065490 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.189845 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.189995 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.190027 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.190068 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.190113 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.190157 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.190199 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.190219 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjsd9\" (UniqueName: \"kubernetes.io/projected/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-kube-api-access-jjsd9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.190245 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.292940 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.293009 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjsd9\" (UniqueName: \"kubernetes.io/projected/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-kube-api-access-jjsd9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.293055 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.293130 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.293304 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.293341 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.293402 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.293482 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.293573 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.294346 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.308554 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.309484 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.312262 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.314179 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.314899 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.321991 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.335972 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjsd9\" (UniqueName: \"kubernetes.io/projected/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-kube-api-access-jjsd9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.343585 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-frpjr\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.375202 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.915785 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr"]
Oct 09 11:07:12 crc kubenswrapper[4740]: W1009 11:07:12.919251 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod914ecc49_bc7d_4c7d_b2f7_1fd9fbb42207.slice/crio-3a37a3bfbd5f6dbe835d8f7b6bdc68310ae1dd2fd981ac24f5bcad6e56977b39 WatchSource:0}: Error finding container 3a37a3bfbd5f6dbe835d8f7b6bdc68310ae1dd2fd981ac24f5bcad6e56977b39: Status 404 returned error can't find the container with id 3a37a3bfbd5f6dbe835d8f7b6bdc68310ae1dd2fd981ac24f5bcad6e56977b39
Oct 09 11:07:12 crc kubenswrapper[4740]: I1009 11:07:12.984801 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr" event={"ID":"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207","Type":"ContainerStarted","Data":"3a37a3bfbd5f6dbe835d8f7b6bdc68310ae1dd2fd981ac24f5bcad6e56977b39"}
Oct 09 11:07:14 crc kubenswrapper[4740]: I1009 11:07:14.003675 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr" event={"ID":"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207","Type":"ContainerStarted","Data":"b784f0ad80d27b4e0f1d29ecb116148da7d147c8df791c26983efb0f236b76a8"}
Oct 09 11:07:17 crc kubenswrapper[4740]: I1009 11:07:17.753835 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689"
Oct 09 11:07:17 crc kubenswrapper[4740]: E1009 11:07:17.754906 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a"
Oct 09 11:07:30 crc kubenswrapper[4740]: I1009 11:07:30.754421 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689"
Oct 09 11:07:30 crc kubenswrapper[4740]: E1009 11:07:30.755310 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a"
Oct 09 11:07:41 crc kubenswrapper[4740]: I1009 11:07:41.762913 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689"
Oct 09 11:07:41 crc kubenswrapper[4740]: E1009 11:07:41.764339 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a"
Oct 09 11:07:54 crc kubenswrapper[4740]: I1009 11:07:54.753485 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689"
Oct 09 11:07:54 crc kubenswrapper[4740]: E1009 11:07:54.754284 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:08:08 crc kubenswrapper[4740]: I1009 11:08:08.753893 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689" Oct 09 11:08:08 crc kubenswrapper[4740]: E1009 11:08:08.754734 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:08:20 crc kubenswrapper[4740]: I1009 11:08:20.754572 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689" Oct 09 11:08:20 crc kubenswrapper[4740]: E1009 11:08:20.755636 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:08:33 crc kubenswrapper[4740]: I1009 11:08:33.753676 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689" Oct 09 11:08:33 crc kubenswrapper[4740]: E1009 11:08:33.754454 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:08:48 crc kubenswrapper[4740]: I1009 11:08:48.753385 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689" Oct 09 11:08:48 crc kubenswrapper[4740]: E1009 11:08:48.754075 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:08:59 crc kubenswrapper[4740]: I1009 11:08:59.754735 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689" Oct 09 11:08:59 crc kubenswrapper[4740]: E1009 11:08:59.755630 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:09:10 crc kubenswrapper[4740]: I1009 11:09:10.753773 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689" Oct 09 11:09:10 crc kubenswrapper[4740]: E1009 11:09:10.754787 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:09:23 crc kubenswrapper[4740]: I1009 11:09:23.754510 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689" Oct 09 11:09:23 crc kubenswrapper[4740]: E1009 11:09:23.755652 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:09:34 crc kubenswrapper[4740]: I1009 11:09:34.757583 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689" Oct 09 11:09:34 crc kubenswrapper[4740]: E1009 11:09:34.758552 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:09:49 crc kubenswrapper[4740]: I1009 11:09:49.753852 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689" Oct 09 11:09:50 crc kubenswrapper[4740]: I1009 11:09:50.552848 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" 
event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerStarted","Data":"4f224417477db387e7ae5d61638574b08df79764d680b81f13a9b4e20caf519e"} Oct 09 11:09:50 crc kubenswrapper[4740]: I1009 11:09:50.578284 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr" podStartSLOduration=158.046469915 podStartE2EDuration="2m38.578268292s" podCreationTimestamp="2025-10-09 11:07:12 +0000 UTC" firstStartedPulling="2025-10-09 11:07:12.923456981 +0000 UTC m=+2371.885657382" lastFinishedPulling="2025-10-09 11:07:13.455255378 +0000 UTC m=+2372.417455759" observedRunningTime="2025-10-09 11:07:14.027326469 +0000 UTC m=+2372.989526880" watchObservedRunningTime="2025-10-09 11:09:50.578268292 +0000 UTC m=+2529.540468673" Oct 09 11:10:27 crc kubenswrapper[4740]: I1009 11:10:27.900196 4740 generic.go:334] "Generic (PLEG): container finished" podID="914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207" containerID="b784f0ad80d27b4e0f1d29ecb116148da7d147c8df791c26983efb0f236b76a8" exitCode=0 Oct 09 11:10:27 crc kubenswrapper[4740]: I1009 11:10:27.900266 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr" event={"ID":"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207","Type":"ContainerDied","Data":"b784f0ad80d27b4e0f1d29ecb116148da7d147c8df791c26983efb0f236b76a8"} Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.306789 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr" Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.455821 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-ssh-key\") pod \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.455931 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-migration-ssh-key-1\") pod \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.455979 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-cell1-compute-config-0\") pod \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.456005 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjsd9\" (UniqueName: \"kubernetes.io/projected/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-kube-api-access-jjsd9\") pod \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.456115 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-extra-config-0\") pod \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.456135 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-cell1-compute-config-1\") pod \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.456152 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-migration-ssh-key-0\") pod \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.456179 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-combined-ca-bundle\") pod \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.456195 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-inventory\") pod \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\" (UID: \"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207\") " Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.462927 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207" (UID: "914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.462934 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-kube-api-access-jjsd9" (OuterVolumeSpecName: "kube-api-access-jjsd9") pod "914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207" (UID: "914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207"). InnerVolumeSpecName "kube-api-access-jjsd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.486093 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-inventory" (OuterVolumeSpecName: "inventory") pod "914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207" (UID: "914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.486243 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207" (UID: "914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.490087 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207" (UID: "914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.493606 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207" (UID: "914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.498894 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207" (UID: "914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.499728 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207" (UID: "914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.503951 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207" (UID: "914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.559116 4740 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.559159 4740 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.559173 4740 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.559185 4740 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.559198 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.559207 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.559218 4740 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 09 11:10:29 
crc kubenswrapper[4740]: I1009 11:10:29.559228 4740 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.559241 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjsd9\" (UniqueName: \"kubernetes.io/projected/914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207-kube-api-access-jjsd9\") on node \"crc\" DevicePath \"\"" Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.921380 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr" event={"ID":"914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207","Type":"ContainerDied","Data":"3a37a3bfbd5f6dbe835d8f7b6bdc68310ae1dd2fd981ac24f5bcad6e56977b39"} Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.921431 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a37a3bfbd5f6dbe835d8f7b6bdc68310ae1dd2fd981ac24f5bcad6e56977b39" Oct 09 11:10:29 crc kubenswrapper[4740]: I1009 11:10:29.921523 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-frpjr" Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.012806 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft"] Oct 09 11:10:30 crc kubenswrapper[4740]: E1009 11:10:30.013201 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.013217 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.013414 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.014091 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft" Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.016569 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.021374 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.021439 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.021541 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.021656 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hslsm" Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.033032 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft"] Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.170954 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ggkft\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft" Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.171304 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqb6j\" (UniqueName: \"kubernetes.io/projected/40e8133a-5380-4983-a96f-8f28d50108a9-kube-api-access-zqb6j\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-ggkft\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft" Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.171333 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ggkft\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft" Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.171360 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ggkft\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft" Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.171444 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ggkft\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft" Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.171478 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ggkft\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft" Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.171539 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ggkft\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft" Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.273564 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ggkft\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft" Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.273630 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ggkft\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft" Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.273686 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ggkft\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft" Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.273740 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ggkft\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft"
Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.273783 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqb6j\" (UniqueName: \"kubernetes.io/projected/40e8133a-5380-4983-a96f-8f28d50108a9-kube-api-access-zqb6j\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ggkft\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft"
Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.273805 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ggkft\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft"
Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.273827 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ggkft\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft"
Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.277425 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ggkft\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft"
Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.277960 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ggkft\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft"
Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.278076 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ggkft\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft"
Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.278208 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ggkft\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft"
Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.278516 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ggkft\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft"
Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.279995 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ggkft\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft"
Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.291707 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqb6j\" (UniqueName: \"kubernetes.io/projected/40e8133a-5380-4983-a96f-8f28d50108a9-kube-api-access-zqb6j\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ggkft\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft"
Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.333699 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft"
Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.853340 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.858302 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft"]
Oct 09 11:10:30 crc kubenswrapper[4740]: I1009 11:10:30.933381 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft" event={"ID":"40e8133a-5380-4983-a96f-8f28d50108a9","Type":"ContainerStarted","Data":"2f9db023ea2a3cba00bfb54057202587f9f1587cc33a9eca8e8ade29744a3a92"}
Oct 09 11:10:31 crc kubenswrapper[4740]: I1009 11:10:31.946364 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft" event={"ID":"40e8133a-5380-4983-a96f-8f28d50108a9","Type":"ContainerStarted","Data":"4fce1e8189010b8a4663f2e61a776dd7af0e329834badc9a23cc385d4acec8eb"}
Oct 09 11:10:31 crc kubenswrapper[4740]: I1009 11:10:31.972407 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft" podStartSLOduration=2.361319439 podStartE2EDuration="2.972387117s" podCreationTimestamp="2025-10-09 11:10:29 +0000 UTC" firstStartedPulling="2025-10-09 11:10:30.853122875 +0000 UTC m=+2569.815323256" lastFinishedPulling="2025-10-09 11:10:31.464190553 +0000 UTC m=+2570.426390934" observedRunningTime="2025-10-09 11:10:31.967445132 +0000 UTC m=+2570.929645503" watchObservedRunningTime="2025-10-09 11:10:31.972387117 +0000 UTC m=+2570.934587498"
Oct 09 11:12:05 crc kubenswrapper[4740]: I1009 11:12:05.408962 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 09 11:12:05 crc kubenswrapper[4740]: I1009 11:12:05.410563 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 09 11:12:35 crc kubenswrapper[4740]: I1009 11:12:35.407310 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 09 11:12:35 crc kubenswrapper[4740]: I1009 11:12:35.407947 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 09 11:12:54 crc kubenswrapper[4740]: I1009 11:12:54.335593 4740 generic.go:334] "Generic (PLEG): container finished" podID="40e8133a-5380-4983-a96f-8f28d50108a9" containerID="4fce1e8189010b8a4663f2e61a776dd7af0e329834badc9a23cc385d4acec8eb" exitCode=0
Oct 09 11:12:54 crc kubenswrapper[4740]: I1009 11:12:54.335678 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft" event={"ID":"40e8133a-5380-4983-a96f-8f28d50108a9","Type":"ContainerDied","Data":"4fce1e8189010b8a4663f2e61a776dd7af0e329834badc9a23cc385d4acec8eb"}
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.733722 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft"
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.880312 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ceilometer-compute-config-data-2\") pod \"40e8133a-5380-4983-a96f-8f28d50108a9\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") "
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.880390 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-inventory\") pod \"40e8133a-5380-4983-a96f-8f28d50108a9\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") "
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.880425 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqb6j\" (UniqueName: \"kubernetes.io/projected/40e8133a-5380-4983-a96f-8f28d50108a9-kube-api-access-zqb6j\") pod \"40e8133a-5380-4983-a96f-8f28d50108a9\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") "
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.880535 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-telemetry-combined-ca-bundle\") pod \"40e8133a-5380-4983-a96f-8f28d50108a9\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") "
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.880563 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ceilometer-compute-config-data-1\") pod \"40e8133a-5380-4983-a96f-8f28d50108a9\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") "
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.880616 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ssh-key\") pod \"40e8133a-5380-4983-a96f-8f28d50108a9\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") "
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.880647 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ceilometer-compute-config-data-0\") pod \"40e8133a-5380-4983-a96f-8f28d50108a9\" (UID: \"40e8133a-5380-4983-a96f-8f28d50108a9\") "
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.900111 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40e8133a-5380-4983-a96f-8f28d50108a9-kube-api-access-zqb6j" (OuterVolumeSpecName: "kube-api-access-zqb6j") pod "40e8133a-5380-4983-a96f-8f28d50108a9" (UID: "40e8133a-5380-4983-a96f-8f28d50108a9"). InnerVolumeSpecName "kube-api-access-zqb6j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.901294 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "40e8133a-5380-4983-a96f-8f28d50108a9" (UID: "40e8133a-5380-4983-a96f-8f28d50108a9"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.910638 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "40e8133a-5380-4983-a96f-8f28d50108a9" (UID: "40e8133a-5380-4983-a96f-8f28d50108a9"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.916631 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-inventory" (OuterVolumeSpecName: "inventory") pod "40e8133a-5380-4983-a96f-8f28d50108a9" (UID: "40e8133a-5380-4983-a96f-8f28d50108a9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.920012 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "40e8133a-5380-4983-a96f-8f28d50108a9" (UID: "40e8133a-5380-4983-a96f-8f28d50108a9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.931321 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "40e8133a-5380-4983-a96f-8f28d50108a9" (UID: "40e8133a-5380-4983-a96f-8f28d50108a9"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.957692 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "40e8133a-5380-4983-a96f-8f28d50108a9" (UID: "40e8133a-5380-4983-a96f-8f28d50108a9"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.987633 4740 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.987669 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-inventory\") on node \"crc\" DevicePath \"\""
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.987682 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqb6j\" (UniqueName: \"kubernetes.io/projected/40e8133a-5380-4983-a96f-8f28d50108a9-kube-api-access-zqb6j\") on node \"crc\" DevicePath \"\""
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.987692 4740 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.987701 4740 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.987710 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 09 11:12:55 crc kubenswrapper[4740]: I1009 11:12:55.987721 4740 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/40e8133a-5380-4983-a96f-8f28d50108a9-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Oct 09 11:12:56 crc kubenswrapper[4740]: I1009 11:12:56.358027 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft" event={"ID":"40e8133a-5380-4983-a96f-8f28d50108a9","Type":"ContainerDied","Data":"2f9db023ea2a3cba00bfb54057202587f9f1587cc33a9eca8e8ade29744a3a92"}
Oct 09 11:12:56 crc kubenswrapper[4740]: I1009 11:12:56.358069 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f9db023ea2a3cba00bfb54057202587f9f1587cc33a9eca8e8ade29744a3a92"
Oct 09 11:12:56 crc kubenswrapper[4740]: I1009 11:12:56.358092 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ggkft"
Oct 09 11:13:01 crc kubenswrapper[4740]: I1009 11:13:01.630546 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xhgj9"]
Oct 09 11:13:01 crc kubenswrapper[4740]: E1009 11:13:01.631094 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e8133a-5380-4983-a96f-8f28d50108a9" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Oct 09 11:13:01 crc kubenswrapper[4740]: I1009 11:13:01.631108 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e8133a-5380-4983-a96f-8f28d50108a9" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Oct 09 11:13:01 crc kubenswrapper[4740]: I1009 11:13:01.631308 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="40e8133a-5380-4983-a96f-8f28d50108a9" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Oct 09 11:13:01 crc kubenswrapper[4740]: I1009 11:13:01.632763 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xhgj9"
Oct 09 11:13:01 crc kubenswrapper[4740]: I1009 11:13:01.650745 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xhgj9"]
Oct 09 11:13:01 crc kubenswrapper[4740]: I1009 11:13:01.802628 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16afaea2-f8d6-4e18-be3d-e11dd3b077f1-utilities\") pod \"certified-operators-xhgj9\" (UID: \"16afaea2-f8d6-4e18-be3d-e11dd3b077f1\") " pod="openshift-marketplace/certified-operators-xhgj9"
Oct 09 11:13:01 crc kubenswrapper[4740]: I1009 11:13:01.802677 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7426\" (UniqueName: \"kubernetes.io/projected/16afaea2-f8d6-4e18-be3d-e11dd3b077f1-kube-api-access-q7426\") pod \"certified-operators-xhgj9\" (UID: \"16afaea2-f8d6-4e18-be3d-e11dd3b077f1\") " pod="openshift-marketplace/certified-operators-xhgj9"
Oct 09 11:13:01 crc kubenswrapper[4740]: I1009 11:13:01.802716 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16afaea2-f8d6-4e18-be3d-e11dd3b077f1-catalog-content\") pod \"certified-operators-xhgj9\" (UID: \"16afaea2-f8d6-4e18-be3d-e11dd3b077f1\") " pod="openshift-marketplace/certified-operators-xhgj9"
Oct 09 11:13:01 crc kubenswrapper[4740]: I1009 11:13:01.904233 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16afaea2-f8d6-4e18-be3d-e11dd3b077f1-utilities\") pod \"certified-operators-xhgj9\" (UID: \"16afaea2-f8d6-4e18-be3d-e11dd3b077f1\") " pod="openshift-marketplace/certified-operators-xhgj9"
Oct 09 11:13:01 crc kubenswrapper[4740]: I1009 11:13:01.904282 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7426\" (UniqueName: \"kubernetes.io/projected/16afaea2-f8d6-4e18-be3d-e11dd3b077f1-kube-api-access-q7426\") pod \"certified-operators-xhgj9\" (UID: \"16afaea2-f8d6-4e18-be3d-e11dd3b077f1\") " pod="openshift-marketplace/certified-operators-xhgj9"
Oct 09 11:13:01 crc kubenswrapper[4740]: I1009 11:13:01.904318 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16afaea2-f8d6-4e18-be3d-e11dd3b077f1-catalog-content\") pod \"certified-operators-xhgj9\" (UID: \"16afaea2-f8d6-4e18-be3d-e11dd3b077f1\") " pod="openshift-marketplace/certified-operators-xhgj9"
Oct 09 11:13:01 crc kubenswrapper[4740]: I1009 11:13:01.904840 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16afaea2-f8d6-4e18-be3d-e11dd3b077f1-utilities\") pod \"certified-operators-xhgj9\" (UID: \"16afaea2-f8d6-4e18-be3d-e11dd3b077f1\") " pod="openshift-marketplace/certified-operators-xhgj9"
Oct 09 11:13:01 crc kubenswrapper[4740]: I1009 11:13:01.904850 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16afaea2-f8d6-4e18-be3d-e11dd3b077f1-catalog-content\") pod \"certified-operators-xhgj9\" (UID: \"16afaea2-f8d6-4e18-be3d-e11dd3b077f1\") " pod="openshift-marketplace/certified-operators-xhgj9"
Oct 09 11:13:01 crc kubenswrapper[4740]: I1009 11:13:01.927607 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7426\" (UniqueName: \"kubernetes.io/projected/16afaea2-f8d6-4e18-be3d-e11dd3b077f1-kube-api-access-q7426\") pod \"certified-operators-xhgj9\" (UID: \"16afaea2-f8d6-4e18-be3d-e11dd3b077f1\") " pod="openshift-marketplace/certified-operators-xhgj9"
Oct 09 11:13:01 crc kubenswrapper[4740]: I1009 11:13:01.975381 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xhgj9"
Oct 09 11:13:02 crc kubenswrapper[4740]: I1009 11:13:02.489968 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xhgj9"]
Oct 09 11:13:03 crc kubenswrapper[4740]: I1009 11:13:03.436998 4740 generic.go:334] "Generic (PLEG): container finished" podID="16afaea2-f8d6-4e18-be3d-e11dd3b077f1" containerID="e41c9d5e0b6434c86b7f6f48f09bf0860d0822c09b9aa6a9e20e65f7424de2d8" exitCode=0
Oct 09 11:13:03 crc kubenswrapper[4740]: I1009 11:13:03.437082 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhgj9" event={"ID":"16afaea2-f8d6-4e18-be3d-e11dd3b077f1","Type":"ContainerDied","Data":"e41c9d5e0b6434c86b7f6f48f09bf0860d0822c09b9aa6a9e20e65f7424de2d8"}
Oct 09 11:13:03 crc kubenswrapper[4740]: I1009 11:13:03.437338 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhgj9" event={"ID":"16afaea2-f8d6-4e18-be3d-e11dd3b077f1","Type":"ContainerStarted","Data":"10e1c9eda6ca946c5f1c71363ebdd7f501bce1ba0847bf8fcbf01be86839f6fa"}
Oct 09 11:13:04 crc kubenswrapper[4740]: I1009 11:13:04.457448 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhgj9" event={"ID":"16afaea2-f8d6-4e18-be3d-e11dd3b077f1","Type":"ContainerStarted","Data":"bb47f67cd1c4e0000f3acb7ef9d13dc0cd3a47ff71b031f07f4d87ef0043081e"}
Oct 09 11:13:05 crc kubenswrapper[4740]: I1009 11:13:05.407859 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 09 11:13:05 crc kubenswrapper[4740]: I1009 11:13:05.408172 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 09 11:13:05 crc kubenswrapper[4740]: I1009 11:13:05.408236 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kdjch"
Oct 09 11:13:05 crc kubenswrapper[4740]: I1009 11:13:05.409141 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4f224417477db387e7ae5d61638574b08df79764d680b81f13a9b4e20caf519e"} pod="openshift-machine-config-operator/machine-config-daemon-kdjch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 09 11:13:05 crc kubenswrapper[4740]: I1009 11:13:05.409233 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" containerID="cri-o://4f224417477db387e7ae5d61638574b08df79764d680b81f13a9b4e20caf519e" gracePeriod=600
Oct 09 11:13:05 crc kubenswrapper[4740]: I1009 11:13:05.474089 4740 generic.go:334] "Generic (PLEG): container finished" podID="16afaea2-f8d6-4e18-be3d-e11dd3b077f1" containerID="bb47f67cd1c4e0000f3acb7ef9d13dc0cd3a47ff71b031f07f4d87ef0043081e" exitCode=0
Oct 09 11:13:05 crc kubenswrapper[4740]: I1009 11:13:05.474138 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhgj9" event={"ID":"16afaea2-f8d6-4e18-be3d-e11dd3b077f1","Type":"ContainerDied","Data":"bb47f67cd1c4e0000f3acb7ef9d13dc0cd3a47ff71b031f07f4d87ef0043081e"}
Oct 09 11:13:06 crc kubenswrapper[4740]: I1009 11:13:06.484173 4740 generic.go:334] "Generic (PLEG): container finished" podID="223b849a-db98-4f56-a649-9e144189950a" containerID="4f224417477db387e7ae5d61638574b08df79764d680b81f13a9b4e20caf519e" exitCode=0
Oct 09 11:13:06 crc kubenswrapper[4740]: I1009 11:13:06.484240 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerDied","Data":"4f224417477db387e7ae5d61638574b08df79764d680b81f13a9b4e20caf519e"}
Oct 09 11:13:06 crc kubenswrapper[4740]: I1009 11:13:06.484880 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerStarted","Data":"ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8"}
Oct 09 11:13:06 crc kubenswrapper[4740]: I1009 11:13:06.484903 4740 scope.go:117] "RemoveContainer" containerID="a74648d49e1c893675c73537c284b80f75d99557d09d33a403aee9bb75421689"
Oct 09 11:13:06 crc kubenswrapper[4740]: I1009 11:13:06.489978 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhgj9" event={"ID":"16afaea2-f8d6-4e18-be3d-e11dd3b077f1","Type":"ContainerStarted","Data":"fa16667c119fa645edf0e6ce59bc9996778cc3003d840700aab863e077bf6ef9"}
Oct 09 11:13:06 crc kubenswrapper[4740]: I1009 11:13:06.538168 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xhgj9" podStartSLOduration=2.8078766010000002 podStartE2EDuration="5.538138337s" podCreationTimestamp="2025-10-09 11:13:01 +0000 UTC" firstStartedPulling="2025-10-09 11:13:03.441686322 +0000 UTC m=+2722.403886713" lastFinishedPulling="2025-10-09 11:13:06.171948068 +0000 UTC m=+2725.134148449" observedRunningTime="2025-10-09 11:13:06.530318153 +0000 UTC m=+2725.492518534" watchObservedRunningTime="2025-10-09 11:13:06.538138337 +0000 UTC m=+2725.500338738"
Oct 09 11:13:11 crc kubenswrapper[4740]: I1009 11:13:11.975637 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xhgj9"
Oct 09 11:13:11 crc kubenswrapper[4740]: I1009 11:13:11.976310 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xhgj9"
Oct 09 11:13:12 crc kubenswrapper[4740]: I1009 11:13:12.060615 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xhgj9"
Oct 09 11:13:12 crc kubenswrapper[4740]: I1009 11:13:12.630700 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xhgj9"
Oct 09 11:13:12 crc kubenswrapper[4740]: I1009 11:13:12.694237 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xhgj9"]
Oct 09 11:13:14 crc kubenswrapper[4740]: I1009 11:13:14.580806 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xhgj9" podUID="16afaea2-f8d6-4e18-be3d-e11dd3b077f1" containerName="registry-server" containerID="cri-o://fa16667c119fa645edf0e6ce59bc9996778cc3003d840700aab863e077bf6ef9" gracePeriod=2
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.005896 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xhgj9"
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.189988 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16afaea2-f8d6-4e18-be3d-e11dd3b077f1-catalog-content\") pod \"16afaea2-f8d6-4e18-be3d-e11dd3b077f1\" (UID: \"16afaea2-f8d6-4e18-be3d-e11dd3b077f1\") "
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.190154 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16afaea2-f8d6-4e18-be3d-e11dd3b077f1-utilities\") pod \"16afaea2-f8d6-4e18-be3d-e11dd3b077f1\" (UID: \"16afaea2-f8d6-4e18-be3d-e11dd3b077f1\") "
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.190249 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7426\" (UniqueName: \"kubernetes.io/projected/16afaea2-f8d6-4e18-be3d-e11dd3b077f1-kube-api-access-q7426\") pod \"16afaea2-f8d6-4e18-be3d-e11dd3b077f1\" (UID: \"16afaea2-f8d6-4e18-be3d-e11dd3b077f1\") "
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.190985 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16afaea2-f8d6-4e18-be3d-e11dd3b077f1-utilities" (OuterVolumeSpecName: "utilities") pod "16afaea2-f8d6-4e18-be3d-e11dd3b077f1" (UID: "16afaea2-f8d6-4e18-be3d-e11dd3b077f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.205118 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16afaea2-f8d6-4e18-be3d-e11dd3b077f1-kube-api-access-q7426" (OuterVolumeSpecName: "kube-api-access-q7426") pod "16afaea2-f8d6-4e18-be3d-e11dd3b077f1" (UID: "16afaea2-f8d6-4e18-be3d-e11dd3b077f1"). InnerVolumeSpecName "kube-api-access-q7426". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.232979 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16afaea2-f8d6-4e18-be3d-e11dd3b077f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16afaea2-f8d6-4e18-be3d-e11dd3b077f1" (UID: "16afaea2-f8d6-4e18-be3d-e11dd3b077f1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.292392 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16afaea2-f8d6-4e18-be3d-e11dd3b077f1-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.292723 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16afaea2-f8d6-4e18-be3d-e11dd3b077f1-utilities\") on node \"crc\" DevicePath \"\""
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.293020 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7426\" (UniqueName: \"kubernetes.io/projected/16afaea2-f8d6-4e18-be3d-e11dd3b077f1-kube-api-access-q7426\") on node \"crc\" DevicePath \"\""
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.593224 4740 generic.go:334] "Generic (PLEG): container finished" podID="16afaea2-f8d6-4e18-be3d-e11dd3b077f1" containerID="fa16667c119fa645edf0e6ce59bc9996778cc3003d840700aab863e077bf6ef9" exitCode=0
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.593299 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xhgj9"
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.593284 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhgj9" event={"ID":"16afaea2-f8d6-4e18-be3d-e11dd3b077f1","Type":"ContainerDied","Data":"fa16667c119fa645edf0e6ce59bc9996778cc3003d840700aab863e077bf6ef9"}
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.593710 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhgj9" event={"ID":"16afaea2-f8d6-4e18-be3d-e11dd3b077f1","Type":"ContainerDied","Data":"10e1c9eda6ca946c5f1c71363ebdd7f501bce1ba0847bf8fcbf01be86839f6fa"}
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.593737 4740 scope.go:117] "RemoveContainer" containerID="fa16667c119fa645edf0e6ce59bc9996778cc3003d840700aab863e077bf6ef9"
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.616142 4740 scope.go:117] "RemoveContainer" containerID="bb47f67cd1c4e0000f3acb7ef9d13dc0cd3a47ff71b031f07f4d87ef0043081e"
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.640085 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xhgj9"]
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.648840 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xhgj9"]
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.650875 4740 scope.go:117] "RemoveContainer" containerID="e41c9d5e0b6434c86b7f6f48f09bf0860d0822c09b9aa6a9e20e65f7424de2d8"
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.687251 4740 scope.go:117] "RemoveContainer" containerID="fa16667c119fa645edf0e6ce59bc9996778cc3003d840700aab863e077bf6ef9"
Oct 09 11:13:15 crc kubenswrapper[4740]: E1009 11:13:15.687831 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa16667c119fa645edf0e6ce59bc9996778cc3003d840700aab863e077bf6ef9\": container with ID starting with fa16667c119fa645edf0e6ce59bc9996778cc3003d840700aab863e077bf6ef9 not found: ID does not exist" containerID="fa16667c119fa645edf0e6ce59bc9996778cc3003d840700aab863e077bf6ef9"
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.688079 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa16667c119fa645edf0e6ce59bc9996778cc3003d840700aab863e077bf6ef9"} err="failed to get container status \"fa16667c119fa645edf0e6ce59bc9996778cc3003d840700aab863e077bf6ef9\": rpc error: code = NotFound desc = could not find container \"fa16667c119fa645edf0e6ce59bc9996778cc3003d840700aab863e077bf6ef9\": container with ID starting with fa16667c119fa645edf0e6ce59bc9996778cc3003d840700aab863e077bf6ef9 not found: ID does not exist"
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.688224 4740 scope.go:117] "RemoveContainer" containerID="bb47f67cd1c4e0000f3acb7ef9d13dc0cd3a47ff71b031f07f4d87ef0043081e"
Oct 09 11:13:15 crc kubenswrapper[4740]: E1009 11:13:15.688830 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb47f67cd1c4e0000f3acb7ef9d13dc0cd3a47ff71b031f07f4d87ef0043081e\": container with ID starting with bb47f67cd1c4e0000f3acb7ef9d13dc0cd3a47ff71b031f07f4d87ef0043081e not found: ID does not exist" containerID="bb47f67cd1c4e0000f3acb7ef9d13dc0cd3a47ff71b031f07f4d87ef0043081e"
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.688893 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb47f67cd1c4e0000f3acb7ef9d13dc0cd3a47ff71b031f07f4d87ef0043081e"} err="failed to get container status \"bb47f67cd1c4e0000f3acb7ef9d13dc0cd3a47ff71b031f07f4d87ef0043081e\": rpc error: code = NotFound desc = could not find container \"bb47f67cd1c4e0000f3acb7ef9d13dc0cd3a47ff71b031f07f4d87ef0043081e\": container with ID starting with bb47f67cd1c4e0000f3acb7ef9d13dc0cd3a47ff71b031f07f4d87ef0043081e not found: ID does not exist"
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.688937 4740 scope.go:117] "RemoveContainer" containerID="e41c9d5e0b6434c86b7f6f48f09bf0860d0822c09b9aa6a9e20e65f7424de2d8"
Oct 09 11:13:15 crc kubenswrapper[4740]: E1009 11:13:15.689340 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e41c9d5e0b6434c86b7f6f48f09bf0860d0822c09b9aa6a9e20e65f7424de2d8\": container with ID starting with e41c9d5e0b6434c86b7f6f48f09bf0860d0822c09b9aa6a9e20e65f7424de2d8 not found: ID does not exist" containerID="e41c9d5e0b6434c86b7f6f48f09bf0860d0822c09b9aa6a9e20e65f7424de2d8"
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.689406 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e41c9d5e0b6434c86b7f6f48f09bf0860d0822c09b9aa6a9e20e65f7424de2d8"} err="failed to get container status \"e41c9d5e0b6434c86b7f6f48f09bf0860d0822c09b9aa6a9e20e65f7424de2d8\": rpc error: code = NotFound desc = could not find container \"e41c9d5e0b6434c86b7f6f48f09bf0860d0822c09b9aa6a9e20e65f7424de2d8\": container with ID starting with e41c9d5e0b6434c86b7f6f48f09bf0860d0822c09b9aa6a9e20e65f7424de2d8 not found: ID does not exist"
Oct 09 11:13:15 crc kubenswrapper[4740]: I1009 11:13:15.769380 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16afaea2-f8d6-4e18-be3d-e11dd3b077f1" path="/var/lib/kubelet/pods/16afaea2-f8d6-4e18-be3d-e11dd3b077f1/volumes"
Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.006357 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Oct 09 11:13:42 crc kubenswrapper[4740]: E1009 11:13:42.009020 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16afaea2-f8d6-4e18-be3d-e11dd3b077f1" containerName="extract-utilities"
Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.009115 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="16afaea2-f8d6-4e18-be3d-e11dd3b077f1" containerName="extract-utilities"
Oct 09 11:13:42 crc kubenswrapper[4740]: E1009 11:13:42.009204 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16afaea2-f8d6-4e18-be3d-e11dd3b077f1" containerName="extract-content"
Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.009260 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="16afaea2-f8d6-4e18-be3d-e11dd3b077f1" containerName="extract-content"
Oct 09 11:13:42 crc kubenswrapper[4740]: E1009 11:13:42.009337 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16afaea2-f8d6-4e18-be3d-e11dd3b077f1" containerName="registry-server"
Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.009395 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="16afaea2-f8d6-4e18-be3d-e11dd3b077f1" containerName="registry-server"
Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.009647 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="16afaea2-f8d6-4e18-be3d-e11dd3b077f1" containerName="registry-server"
Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.010482 4740 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.012963 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.013120 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.013235 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-6zqq5" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.013415 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.016808 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.135055 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4c1a2aba-0872-4bef-9bad-0ba37788423d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.135133 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c1a2aba-0872-4bef-9bad-0ba37788423d-config-data\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.135346 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/4c1a2aba-0872-4bef-9bad-0ba37788423d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.135494 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4c1a2aba-0872-4bef-9bad-0ba37788423d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.135518 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.135546 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4c1a2aba-0872-4bef-9bad-0ba37788423d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.135573 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4c1a2aba-0872-4bef-9bad-0ba37788423d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.135676 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgskx\" (UniqueName: 
\"kubernetes.io/projected/4c1a2aba-0872-4bef-9bad-0ba37788423d-kube-api-access-dgskx\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.135710 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c1a2aba-0872-4bef-9bad-0ba37788423d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.237645 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4c1a2aba-0872-4bef-9bad-0ba37788423d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.237796 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c1a2aba-0872-4bef-9bad-0ba37788423d-config-data\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.237871 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4c1a2aba-0872-4bef-9bad-0ba37788423d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.237952 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.237982 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4c1a2aba-0872-4bef-9bad-0ba37788423d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.238017 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4c1a2aba-0872-4bef-9bad-0ba37788423d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.238049 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4c1a2aba-0872-4bef-9bad-0ba37788423d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.238085 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgskx\" (UniqueName: \"kubernetes.io/projected/4c1a2aba-0872-4bef-9bad-0ba37788423d-kube-api-access-dgskx\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.238112 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c1a2aba-0872-4bef-9bad-0ba37788423d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " 
pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.238957 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4c1a2aba-0872-4bef-9bad-0ba37788423d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.239143 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4c1a2aba-0872-4bef-9bad-0ba37788423d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.239259 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4c1a2aba-0872-4bef-9bad-0ba37788423d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.239526 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.241841 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c1a2aba-0872-4bef-9bad-0ba37788423d-config-data\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc 
kubenswrapper[4740]: I1009 11:13:42.245006 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4c1a2aba-0872-4bef-9bad-0ba37788423d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.245245 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4c1a2aba-0872-4bef-9bad-0ba37788423d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.247131 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c1a2aba-0872-4bef-9bad-0ba37788423d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.259873 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgskx\" (UniqueName: \"kubernetes.io/projected/4c1a2aba-0872-4bef-9bad-0ba37788423d-kube-api-access-dgskx\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.268647 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.347189 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.812061 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 09 11:13:42 crc kubenswrapper[4740]: W1009 11:13:42.819056 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c1a2aba_0872_4bef_9bad_0ba37788423d.slice/crio-914ffb10720733b1eafe6397d2f881501953d964707b9afbdd8841fc8d96ed25 WatchSource:0}: Error finding container 914ffb10720733b1eafe6397d2f881501953d964707b9afbdd8841fc8d96ed25: Status 404 returned error can't find the container with id 914ffb10720733b1eafe6397d2f881501953d964707b9afbdd8841fc8d96ed25 Oct 09 11:13:42 crc kubenswrapper[4740]: I1009 11:13:42.903312 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4c1a2aba-0872-4bef-9bad-0ba37788423d","Type":"ContainerStarted","Data":"914ffb10720733b1eafe6397d2f881501953d964707b9afbdd8841fc8d96ed25"} Oct 09 11:14:09 crc kubenswrapper[4740]: E1009 11:14:09.443218 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 09 11:14:09 crc kubenswrapper[4740]: E1009 11:14:09.443874 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dgskx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(4c1a2aba-0872-4bef-9bad-0ba37788423d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 11:14:09 crc kubenswrapper[4740]: E1009 11:14:09.445312 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="4c1a2aba-0872-4bef-9bad-0ba37788423d" Oct 09 11:14:10 crc kubenswrapper[4740]: E1009 11:14:10.162799 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="4c1a2aba-0872-4bef-9bad-0ba37788423d" Oct 09 11:14:24 crc 
kubenswrapper[4740]: I1009 11:14:24.321300 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4c1a2aba-0872-4bef-9bad-0ba37788423d","Type":"ContainerStarted","Data":"c8f27d502ab7f6756924b00c462816148003392ebd595d74ba4d2e028498dcde"} Oct 09 11:14:24 crc kubenswrapper[4740]: I1009 11:14:24.345771 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.9115181119999995 podStartE2EDuration="44.345738108s" podCreationTimestamp="2025-10-09 11:13:40 +0000 UTC" firstStartedPulling="2025-10-09 11:13:42.822410557 +0000 UTC m=+2761.784610928" lastFinishedPulling="2025-10-09 11:14:22.256630503 +0000 UTC m=+2801.218830924" observedRunningTime="2025-10-09 11:14:24.339415875 +0000 UTC m=+2803.301616256" watchObservedRunningTime="2025-10-09 11:14:24.345738108 +0000 UTC m=+2803.307938489" Oct 09 11:14:54 crc kubenswrapper[4740]: I1009 11:14:54.950386 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4ttxs"] Oct 09 11:14:54 crc kubenswrapper[4740]: I1009 11:14:54.954905 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ttxs" Oct 09 11:14:54 crc kubenswrapper[4740]: I1009 11:14:54.981430 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ttxs"] Oct 09 11:14:55 crc kubenswrapper[4740]: I1009 11:14:55.109156 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5869a157-1ef6-4edd-a7c7-ca863b07a8ad-utilities\") pod \"redhat-marketplace-4ttxs\" (UID: \"5869a157-1ef6-4edd-a7c7-ca863b07a8ad\") " pod="openshift-marketplace/redhat-marketplace-4ttxs" Oct 09 11:14:55 crc kubenswrapper[4740]: I1009 11:14:55.109455 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5869a157-1ef6-4edd-a7c7-ca863b07a8ad-catalog-content\") pod \"redhat-marketplace-4ttxs\" (UID: \"5869a157-1ef6-4edd-a7c7-ca863b07a8ad\") " pod="openshift-marketplace/redhat-marketplace-4ttxs" Oct 09 11:14:55 crc kubenswrapper[4740]: I1009 11:14:55.109648 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd52k\" (UniqueName: \"kubernetes.io/projected/5869a157-1ef6-4edd-a7c7-ca863b07a8ad-kube-api-access-rd52k\") pod \"redhat-marketplace-4ttxs\" (UID: \"5869a157-1ef6-4edd-a7c7-ca863b07a8ad\") " pod="openshift-marketplace/redhat-marketplace-4ttxs" Oct 09 11:14:55 crc kubenswrapper[4740]: I1009 11:14:55.210870 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5869a157-1ef6-4edd-a7c7-ca863b07a8ad-utilities\") pod \"redhat-marketplace-4ttxs\" (UID: \"5869a157-1ef6-4edd-a7c7-ca863b07a8ad\") " pod="openshift-marketplace/redhat-marketplace-4ttxs" Oct 09 11:14:55 crc kubenswrapper[4740]: I1009 11:14:55.210924 4740 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5869a157-1ef6-4edd-a7c7-ca863b07a8ad-catalog-content\") pod \"redhat-marketplace-4ttxs\" (UID: \"5869a157-1ef6-4edd-a7c7-ca863b07a8ad\") " pod="openshift-marketplace/redhat-marketplace-4ttxs" Oct 09 11:14:55 crc kubenswrapper[4740]: I1009 11:14:55.211027 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd52k\" (UniqueName: \"kubernetes.io/projected/5869a157-1ef6-4edd-a7c7-ca863b07a8ad-kube-api-access-rd52k\") pod \"redhat-marketplace-4ttxs\" (UID: \"5869a157-1ef6-4edd-a7c7-ca863b07a8ad\") " pod="openshift-marketplace/redhat-marketplace-4ttxs" Oct 09 11:14:55 crc kubenswrapper[4740]: I1009 11:14:55.211975 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5869a157-1ef6-4edd-a7c7-ca863b07a8ad-catalog-content\") pod \"redhat-marketplace-4ttxs\" (UID: \"5869a157-1ef6-4edd-a7c7-ca863b07a8ad\") " pod="openshift-marketplace/redhat-marketplace-4ttxs" Oct 09 11:14:55 crc kubenswrapper[4740]: I1009 11:14:55.212231 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5869a157-1ef6-4edd-a7c7-ca863b07a8ad-utilities\") pod \"redhat-marketplace-4ttxs\" (UID: \"5869a157-1ef6-4edd-a7c7-ca863b07a8ad\") " pod="openshift-marketplace/redhat-marketplace-4ttxs" Oct 09 11:14:55 crc kubenswrapper[4740]: I1009 11:14:55.238489 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd52k\" (UniqueName: \"kubernetes.io/projected/5869a157-1ef6-4edd-a7c7-ca863b07a8ad-kube-api-access-rd52k\") pod \"redhat-marketplace-4ttxs\" (UID: \"5869a157-1ef6-4edd-a7c7-ca863b07a8ad\") " pod="openshift-marketplace/redhat-marketplace-4ttxs" Oct 09 11:14:55 crc kubenswrapper[4740]: I1009 11:14:55.283883 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ttxs" Oct 09 11:14:55 crc kubenswrapper[4740]: W1009 11:14:55.743183 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5869a157_1ef6_4edd_a7c7_ca863b07a8ad.slice/crio-9b23e3d4ddf1b3c0ef63da34f8401196fee61f6392068591e73bfcbfb39a421c WatchSource:0}: Error finding container 9b23e3d4ddf1b3c0ef63da34f8401196fee61f6392068591e73bfcbfb39a421c: Status 404 returned error can't find the container with id 9b23e3d4ddf1b3c0ef63da34f8401196fee61f6392068591e73bfcbfb39a421c Oct 09 11:14:55 crc kubenswrapper[4740]: I1009 11:14:55.744401 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ttxs"] Oct 09 11:14:56 crc kubenswrapper[4740]: I1009 11:14:56.680734 4740 generic.go:334] "Generic (PLEG): container finished" podID="5869a157-1ef6-4edd-a7c7-ca863b07a8ad" containerID="6708ac1b1c99a7832ad77b76e628f1d1b155e1ff6d49230779463379bf414774" exitCode=0 Oct 09 11:14:56 crc kubenswrapper[4740]: I1009 11:14:56.680846 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ttxs" event={"ID":"5869a157-1ef6-4edd-a7c7-ca863b07a8ad","Type":"ContainerDied","Data":"6708ac1b1c99a7832ad77b76e628f1d1b155e1ff6d49230779463379bf414774"} Oct 09 11:14:56 crc kubenswrapper[4740]: I1009 11:14:56.680919 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ttxs" event={"ID":"5869a157-1ef6-4edd-a7c7-ca863b07a8ad","Type":"ContainerStarted","Data":"9b23e3d4ddf1b3c0ef63da34f8401196fee61f6392068591e73bfcbfb39a421c"} Oct 09 11:14:57 crc kubenswrapper[4740]: I1009 11:14:57.691326 4740 generic.go:334] "Generic (PLEG): container finished" podID="5869a157-1ef6-4edd-a7c7-ca863b07a8ad" containerID="74662db2eaca938f21bf0b6f0dc9a9fca779af4349ff67b35e0029bb20e21361" exitCode=0 Oct 09 11:14:57 crc kubenswrapper[4740]: I1009 
11:14:57.691381 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ttxs" event={"ID":"5869a157-1ef6-4edd-a7c7-ca863b07a8ad","Type":"ContainerDied","Data":"74662db2eaca938f21bf0b6f0dc9a9fca779af4349ff67b35e0029bb20e21361"} Oct 09 11:14:58 crc kubenswrapper[4740]: I1009 11:14:58.706200 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ttxs" event={"ID":"5869a157-1ef6-4edd-a7c7-ca863b07a8ad","Type":"ContainerStarted","Data":"4e5a9e17c16abff2eb0d3f553ebffd8eac596399156c21a01636e8e5b584bf90"} Oct 09 11:14:58 crc kubenswrapper[4740]: I1009 11:14:58.737208 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4ttxs" podStartSLOduration=3.260871233 podStartE2EDuration="4.737181753s" podCreationTimestamp="2025-10-09 11:14:54 +0000 UTC" firstStartedPulling="2025-10-09 11:14:56.683147717 +0000 UTC m=+2835.645348138" lastFinishedPulling="2025-10-09 11:14:58.159458247 +0000 UTC m=+2837.121658658" observedRunningTime="2025-10-09 11:14:58.729663237 +0000 UTC m=+2837.691863618" watchObservedRunningTime="2025-10-09 11:14:58.737181753 +0000 UTC m=+2837.699382144" Oct 09 11:15:00 crc kubenswrapper[4740]: I1009 11:15:00.177524 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333475-dzs2b"] Oct 09 11:15:00 crc kubenswrapper[4740]: I1009 11:15:00.179650 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333475-dzs2b" Oct 09 11:15:00 crc kubenswrapper[4740]: I1009 11:15:00.186451 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 11:15:00 crc kubenswrapper[4740]: I1009 11:15:00.186557 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 11:15:00 crc kubenswrapper[4740]: I1009 11:15:00.208628 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e60e0cc-224f-4208-af75-113f48786a87-config-volume\") pod \"collect-profiles-29333475-dzs2b\" (UID: \"3e60e0cc-224f-4208-af75-113f48786a87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333475-dzs2b" Oct 09 11:15:00 crc kubenswrapper[4740]: I1009 11:15:00.208692 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e60e0cc-224f-4208-af75-113f48786a87-secret-volume\") pod \"collect-profiles-29333475-dzs2b\" (UID: \"3e60e0cc-224f-4208-af75-113f48786a87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333475-dzs2b" Oct 09 11:15:00 crc kubenswrapper[4740]: I1009 11:15:00.208723 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5gr2\" (UniqueName: \"kubernetes.io/projected/3e60e0cc-224f-4208-af75-113f48786a87-kube-api-access-s5gr2\") pod \"collect-profiles-29333475-dzs2b\" (UID: \"3e60e0cc-224f-4208-af75-113f48786a87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333475-dzs2b" Oct 09 11:15:00 crc kubenswrapper[4740]: I1009 11:15:00.216350 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29333475-dzs2b"] Oct 09 11:15:00 crc kubenswrapper[4740]: I1009 11:15:00.311329 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e60e0cc-224f-4208-af75-113f48786a87-config-volume\") pod \"collect-profiles-29333475-dzs2b\" (UID: \"3e60e0cc-224f-4208-af75-113f48786a87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333475-dzs2b" Oct 09 11:15:00 crc kubenswrapper[4740]: I1009 11:15:00.311390 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5gr2\" (UniqueName: \"kubernetes.io/projected/3e60e0cc-224f-4208-af75-113f48786a87-kube-api-access-s5gr2\") pod \"collect-profiles-29333475-dzs2b\" (UID: \"3e60e0cc-224f-4208-af75-113f48786a87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333475-dzs2b" Oct 09 11:15:00 crc kubenswrapper[4740]: I1009 11:15:00.311416 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e60e0cc-224f-4208-af75-113f48786a87-secret-volume\") pod \"collect-profiles-29333475-dzs2b\" (UID: \"3e60e0cc-224f-4208-af75-113f48786a87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333475-dzs2b" Oct 09 11:15:00 crc kubenswrapper[4740]: I1009 11:15:00.312569 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e60e0cc-224f-4208-af75-113f48786a87-config-volume\") pod \"collect-profiles-29333475-dzs2b\" (UID: \"3e60e0cc-224f-4208-af75-113f48786a87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333475-dzs2b" Oct 09 11:15:00 crc kubenswrapper[4740]: I1009 11:15:00.319514 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/3e60e0cc-224f-4208-af75-113f48786a87-secret-volume\") pod \"collect-profiles-29333475-dzs2b\" (UID: \"3e60e0cc-224f-4208-af75-113f48786a87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333475-dzs2b" Oct 09 11:15:00 crc kubenswrapper[4740]: I1009 11:15:00.330126 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5gr2\" (UniqueName: \"kubernetes.io/projected/3e60e0cc-224f-4208-af75-113f48786a87-kube-api-access-s5gr2\") pod \"collect-profiles-29333475-dzs2b\" (UID: \"3e60e0cc-224f-4208-af75-113f48786a87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333475-dzs2b" Oct 09 11:15:00 crc kubenswrapper[4740]: I1009 11:15:00.514011 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333475-dzs2b" Oct 09 11:15:00 crc kubenswrapper[4740]: I1009 11:15:00.975540 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333475-dzs2b"] Oct 09 11:15:00 crc kubenswrapper[4740]: W1009 11:15:00.985511 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e60e0cc_224f_4208_af75_113f48786a87.slice/crio-33816a68645ad79c566de7aebbdb41835e18a0b37c85fe14323d4bb7884efff0 WatchSource:0}: Error finding container 33816a68645ad79c566de7aebbdb41835e18a0b37c85fe14323d4bb7884efff0: Status 404 returned error can't find the container with id 33816a68645ad79c566de7aebbdb41835e18a0b37c85fe14323d4bb7884efff0 Oct 09 11:15:01 crc kubenswrapper[4740]: I1009 11:15:01.742492 4740 generic.go:334] "Generic (PLEG): container finished" podID="3e60e0cc-224f-4208-af75-113f48786a87" containerID="cbf848c93730e50b964d0e7c33138095fa0818b9d62b2e43ae8c7518a0e50adc" exitCode=0 Oct 09 11:15:01 crc kubenswrapper[4740]: I1009 11:15:01.742569 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29333475-dzs2b" event={"ID":"3e60e0cc-224f-4208-af75-113f48786a87","Type":"ContainerDied","Data":"cbf848c93730e50b964d0e7c33138095fa0818b9d62b2e43ae8c7518a0e50adc"} Oct 09 11:15:01 crc kubenswrapper[4740]: I1009 11:15:01.742649 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333475-dzs2b" event={"ID":"3e60e0cc-224f-4208-af75-113f48786a87","Type":"ContainerStarted","Data":"33816a68645ad79c566de7aebbdb41835e18a0b37c85fe14323d4bb7884efff0"} Oct 09 11:15:03 crc kubenswrapper[4740]: I1009 11:15:03.149819 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333475-dzs2b" Oct 09 11:15:03 crc kubenswrapper[4740]: I1009 11:15:03.267184 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e60e0cc-224f-4208-af75-113f48786a87-secret-volume\") pod \"3e60e0cc-224f-4208-af75-113f48786a87\" (UID: \"3e60e0cc-224f-4208-af75-113f48786a87\") " Oct 09 11:15:03 crc kubenswrapper[4740]: I1009 11:15:03.267363 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e60e0cc-224f-4208-af75-113f48786a87-config-volume\") pod \"3e60e0cc-224f-4208-af75-113f48786a87\" (UID: \"3e60e0cc-224f-4208-af75-113f48786a87\") " Oct 09 11:15:03 crc kubenswrapper[4740]: I1009 11:15:03.267426 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5gr2\" (UniqueName: \"kubernetes.io/projected/3e60e0cc-224f-4208-af75-113f48786a87-kube-api-access-s5gr2\") pod \"3e60e0cc-224f-4208-af75-113f48786a87\" (UID: \"3e60e0cc-224f-4208-af75-113f48786a87\") " Oct 09 11:15:03 crc kubenswrapper[4740]: I1009 11:15:03.268219 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3e60e0cc-224f-4208-af75-113f48786a87-config-volume" (OuterVolumeSpecName: "config-volume") pod "3e60e0cc-224f-4208-af75-113f48786a87" (UID: "3e60e0cc-224f-4208-af75-113f48786a87"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 11:15:03 crc kubenswrapper[4740]: I1009 11:15:03.274311 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e60e0cc-224f-4208-af75-113f48786a87-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3e60e0cc-224f-4208-af75-113f48786a87" (UID: "3e60e0cc-224f-4208-af75-113f48786a87"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:15:03 crc kubenswrapper[4740]: I1009 11:15:03.279688 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e60e0cc-224f-4208-af75-113f48786a87-kube-api-access-s5gr2" (OuterVolumeSpecName: "kube-api-access-s5gr2") pod "3e60e0cc-224f-4208-af75-113f48786a87" (UID: "3e60e0cc-224f-4208-af75-113f48786a87"). InnerVolumeSpecName "kube-api-access-s5gr2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:15:03 crc kubenswrapper[4740]: I1009 11:15:03.369890 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5gr2\" (UniqueName: \"kubernetes.io/projected/3e60e0cc-224f-4208-af75-113f48786a87-kube-api-access-s5gr2\") on node \"crc\" DevicePath \"\"" Oct 09 11:15:03 crc kubenswrapper[4740]: I1009 11:15:03.369936 4740 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e60e0cc-224f-4208-af75-113f48786a87-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 11:15:03 crc kubenswrapper[4740]: I1009 11:15:03.369949 4740 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e60e0cc-224f-4208-af75-113f48786a87-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 11:15:03 crc kubenswrapper[4740]: I1009 11:15:03.763674 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333475-dzs2b" Oct 09 11:15:03 crc kubenswrapper[4740]: I1009 11:15:03.772875 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333475-dzs2b" event={"ID":"3e60e0cc-224f-4208-af75-113f48786a87","Type":"ContainerDied","Data":"33816a68645ad79c566de7aebbdb41835e18a0b37c85fe14323d4bb7884efff0"} Oct 09 11:15:03 crc kubenswrapper[4740]: I1009 11:15:03.772921 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33816a68645ad79c566de7aebbdb41835e18a0b37c85fe14323d4bb7884efff0" Oct 09 11:15:04 crc kubenswrapper[4740]: I1009 11:15:04.236306 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf"] Oct 09 11:15:04 crc kubenswrapper[4740]: I1009 11:15:04.243986 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29333430-lmrlf"] Oct 09 11:15:05 crc kubenswrapper[4740]: I1009 11:15:05.284895 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4ttxs" Oct 09 11:15:05 crc kubenswrapper[4740]: I1009 11:15:05.286045 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4ttxs" Oct 09 11:15:05 crc kubenswrapper[4740]: I1009 11:15:05.341701 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4ttxs" Oct 09 11:15:05 crc kubenswrapper[4740]: I1009 11:15:05.407829 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 11:15:05 crc kubenswrapper[4740]: I1009 11:15:05.407898 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 11:15:05 crc kubenswrapper[4740]: I1009 11:15:05.769143 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4411b16-07f8-4701-ad4f-7645a00e829f" path="/var/lib/kubelet/pods/e4411b16-07f8-4701-ad4f-7645a00e829f/volumes" Oct 09 11:15:05 crc kubenswrapper[4740]: I1009 11:15:05.828174 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4ttxs" Oct 09 11:15:06 crc kubenswrapper[4740]: I1009 11:15:06.944655 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-4ttxs"] Oct 09 11:15:08 crc kubenswrapper[4740]: I1009 11:15:08.809423 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4ttxs" podUID="5869a157-1ef6-4edd-a7c7-ca863b07a8ad" containerName="registry-server" containerID="cri-o://4e5a9e17c16abff2eb0d3f553ebffd8eac596399156c21a01636e8e5b584bf90" gracePeriod=2 Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.263702 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ttxs" Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.390264 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5869a157-1ef6-4edd-a7c7-ca863b07a8ad-catalog-content\") pod \"5869a157-1ef6-4edd-a7c7-ca863b07a8ad\" (UID: \"5869a157-1ef6-4edd-a7c7-ca863b07a8ad\") " Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.390367 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5869a157-1ef6-4edd-a7c7-ca863b07a8ad-utilities\") pod \"5869a157-1ef6-4edd-a7c7-ca863b07a8ad\" (UID: \"5869a157-1ef6-4edd-a7c7-ca863b07a8ad\") " Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.390429 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd52k\" (UniqueName: \"kubernetes.io/projected/5869a157-1ef6-4edd-a7c7-ca863b07a8ad-kube-api-access-rd52k\") pod \"5869a157-1ef6-4edd-a7c7-ca863b07a8ad\" (UID: \"5869a157-1ef6-4edd-a7c7-ca863b07a8ad\") " Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.391147 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5869a157-1ef6-4edd-a7c7-ca863b07a8ad-utilities" (OuterVolumeSpecName: "utilities") pod "5869a157-1ef6-4edd-a7c7-ca863b07a8ad" (UID: 
"5869a157-1ef6-4edd-a7c7-ca863b07a8ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.395955 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5869a157-1ef6-4edd-a7c7-ca863b07a8ad-kube-api-access-rd52k" (OuterVolumeSpecName: "kube-api-access-rd52k") pod "5869a157-1ef6-4edd-a7c7-ca863b07a8ad" (UID: "5869a157-1ef6-4edd-a7c7-ca863b07a8ad"). InnerVolumeSpecName "kube-api-access-rd52k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.405085 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5869a157-1ef6-4edd-a7c7-ca863b07a8ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5869a157-1ef6-4edd-a7c7-ca863b07a8ad" (UID: "5869a157-1ef6-4edd-a7c7-ca863b07a8ad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.492360 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5869a157-1ef6-4edd-a7c7-ca863b07a8ad-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.492394 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5869a157-1ef6-4edd-a7c7-ca863b07a8ad-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.492404 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd52k\" (UniqueName: \"kubernetes.io/projected/5869a157-1ef6-4edd-a7c7-ca863b07a8ad-kube-api-access-rd52k\") on node \"crc\" DevicePath \"\"" Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.823743 4740 generic.go:334] "Generic (PLEG): container finished" 
podID="5869a157-1ef6-4edd-a7c7-ca863b07a8ad" containerID="4e5a9e17c16abff2eb0d3f553ebffd8eac596399156c21a01636e8e5b584bf90" exitCode=0 Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.823843 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ttxs" event={"ID":"5869a157-1ef6-4edd-a7c7-ca863b07a8ad","Type":"ContainerDied","Data":"4e5a9e17c16abff2eb0d3f553ebffd8eac596399156c21a01636e8e5b584bf90"} Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.823864 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ttxs" Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.824039 4740 scope.go:117] "RemoveContainer" containerID="4e5a9e17c16abff2eb0d3f553ebffd8eac596399156c21a01636e8e5b584bf90" Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.823890 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ttxs" event={"ID":"5869a157-1ef6-4edd-a7c7-ca863b07a8ad","Type":"ContainerDied","Data":"9b23e3d4ddf1b3c0ef63da34f8401196fee61f6392068591e73bfcbfb39a421c"} Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.849438 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ttxs"] Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.856982 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ttxs"] Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.861057 4740 scope.go:117] "RemoveContainer" containerID="74662db2eaca938f21bf0b6f0dc9a9fca779af4349ff67b35e0029bb20e21361" Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.890924 4740 scope.go:117] "RemoveContainer" containerID="6708ac1b1c99a7832ad77b76e628f1d1b155e1ff6d49230779463379bf414774" Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.960416 4740 scope.go:117] "RemoveContainer" 
containerID="4e5a9e17c16abff2eb0d3f553ebffd8eac596399156c21a01636e8e5b584bf90" Oct 09 11:15:09 crc kubenswrapper[4740]: E1009 11:15:09.960839 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e5a9e17c16abff2eb0d3f553ebffd8eac596399156c21a01636e8e5b584bf90\": container with ID starting with 4e5a9e17c16abff2eb0d3f553ebffd8eac596399156c21a01636e8e5b584bf90 not found: ID does not exist" containerID="4e5a9e17c16abff2eb0d3f553ebffd8eac596399156c21a01636e8e5b584bf90" Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.960875 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e5a9e17c16abff2eb0d3f553ebffd8eac596399156c21a01636e8e5b584bf90"} err="failed to get container status \"4e5a9e17c16abff2eb0d3f553ebffd8eac596399156c21a01636e8e5b584bf90\": rpc error: code = NotFound desc = could not find container \"4e5a9e17c16abff2eb0d3f553ebffd8eac596399156c21a01636e8e5b584bf90\": container with ID starting with 4e5a9e17c16abff2eb0d3f553ebffd8eac596399156c21a01636e8e5b584bf90 not found: ID does not exist" Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.960903 4740 scope.go:117] "RemoveContainer" containerID="74662db2eaca938f21bf0b6f0dc9a9fca779af4349ff67b35e0029bb20e21361" Oct 09 11:15:09 crc kubenswrapper[4740]: E1009 11:15:09.961156 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74662db2eaca938f21bf0b6f0dc9a9fca779af4349ff67b35e0029bb20e21361\": container with ID starting with 74662db2eaca938f21bf0b6f0dc9a9fca779af4349ff67b35e0029bb20e21361 not found: ID does not exist" containerID="74662db2eaca938f21bf0b6f0dc9a9fca779af4349ff67b35e0029bb20e21361" Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.961179 4740 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"74662db2eaca938f21bf0b6f0dc9a9fca779af4349ff67b35e0029bb20e21361"} err="failed to get container status \"74662db2eaca938f21bf0b6f0dc9a9fca779af4349ff67b35e0029bb20e21361\": rpc error: code = NotFound desc = could not find container \"74662db2eaca938f21bf0b6f0dc9a9fca779af4349ff67b35e0029bb20e21361\": container with ID starting with 74662db2eaca938f21bf0b6f0dc9a9fca779af4349ff67b35e0029bb20e21361 not found: ID does not exist" Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.961191 4740 scope.go:117] "RemoveContainer" containerID="6708ac1b1c99a7832ad77b76e628f1d1b155e1ff6d49230779463379bf414774" Oct 09 11:15:09 crc kubenswrapper[4740]: E1009 11:15:09.961371 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6708ac1b1c99a7832ad77b76e628f1d1b155e1ff6d49230779463379bf414774\": container with ID starting with 6708ac1b1c99a7832ad77b76e628f1d1b155e1ff6d49230779463379bf414774 not found: ID does not exist" containerID="6708ac1b1c99a7832ad77b76e628f1d1b155e1ff6d49230779463379bf414774" Oct 09 11:15:09 crc kubenswrapper[4740]: I1009 11:15:09.961391 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6708ac1b1c99a7832ad77b76e628f1d1b155e1ff6d49230779463379bf414774"} err="failed to get container status \"6708ac1b1c99a7832ad77b76e628f1d1b155e1ff6d49230779463379bf414774\": rpc error: code = NotFound desc = could not find container \"6708ac1b1c99a7832ad77b76e628f1d1b155e1ff6d49230779463379bf414774\": container with ID starting with 6708ac1b1c99a7832ad77b76e628f1d1b155e1ff6d49230779463379bf414774 not found: ID does not exist" Oct 09 11:15:11 crc kubenswrapper[4740]: I1009 11:15:11.767345 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5869a157-1ef6-4edd-a7c7-ca863b07a8ad" path="/var/lib/kubelet/pods/5869a157-1ef6-4edd-a7c7-ca863b07a8ad/volumes" Oct 09 11:15:12 crc kubenswrapper[4740]: I1009 
11:15:12.642330 4740 scope.go:117] "RemoveContainer" containerID="c870180cedd7934f886a0c284ef2c75616503f107aa0fac00c294108d4f2996a" Oct 09 11:15:35 crc kubenswrapper[4740]: I1009 11:15:35.408152 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 11:15:35 crc kubenswrapper[4740]: I1009 11:15:35.409256 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 11:16:05 crc kubenswrapper[4740]: I1009 11:16:05.407656 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 11:16:05 crc kubenswrapper[4740]: I1009 11:16:05.408321 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 11:16:05 crc kubenswrapper[4740]: I1009 11:16:05.408377 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 11:16:05 crc kubenswrapper[4740]: I1009 11:16:05.409395 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8"} pod="openshift-machine-config-operator/machine-config-daemon-kdjch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 11:16:05 crc kubenswrapper[4740]: I1009 11:16:05.409492 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" containerID="cri-o://ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" gracePeriod=600 Oct 09 11:16:05 crc kubenswrapper[4740]: E1009 11:16:05.534257 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:16:06 crc kubenswrapper[4740]: I1009 11:16:06.388478 4740 generic.go:334] "Generic (PLEG): container finished" podID="223b849a-db98-4f56-a649-9e144189950a" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" exitCode=0 Oct 09 11:16:06 crc kubenswrapper[4740]: I1009 11:16:06.388517 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerDied","Data":"ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8"} Oct 09 11:16:06 crc kubenswrapper[4740]: I1009 11:16:06.388546 4740 scope.go:117] "RemoveContainer" containerID="4f224417477db387e7ae5d61638574b08df79764d680b81f13a9b4e20caf519e" Oct 09 11:16:06 crc kubenswrapper[4740]: I1009 11:16:06.389213 4740 
scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:16:06 crc kubenswrapper[4740]: E1009 11:16:06.389596 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:16:17 crc kubenswrapper[4740]: I1009 11:16:17.753533 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:16:17 crc kubenswrapper[4740]: E1009 11:16:17.754392 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:16:32 crc kubenswrapper[4740]: I1009 11:16:32.754040 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:16:32 crc kubenswrapper[4740]: E1009 11:16:32.756188 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:16:45 crc kubenswrapper[4740]: I1009 
11:16:45.755987 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:16:45 crc kubenswrapper[4740]: E1009 11:16:45.757034 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:16:50 crc kubenswrapper[4740]: I1009 11:16:50.899312 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wvcdp"] Oct 09 11:16:50 crc kubenswrapper[4740]: E1009 11:16:50.900468 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e60e0cc-224f-4208-af75-113f48786a87" containerName="collect-profiles" Oct 09 11:16:50 crc kubenswrapper[4740]: I1009 11:16:50.900492 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e60e0cc-224f-4208-af75-113f48786a87" containerName="collect-profiles" Oct 09 11:16:50 crc kubenswrapper[4740]: E1009 11:16:50.900516 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5869a157-1ef6-4edd-a7c7-ca863b07a8ad" containerName="registry-server" Oct 09 11:16:50 crc kubenswrapper[4740]: I1009 11:16:50.900527 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5869a157-1ef6-4edd-a7c7-ca863b07a8ad" containerName="registry-server" Oct 09 11:16:50 crc kubenswrapper[4740]: E1009 11:16:50.900547 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5869a157-1ef6-4edd-a7c7-ca863b07a8ad" containerName="extract-content" Oct 09 11:16:50 crc kubenswrapper[4740]: I1009 11:16:50.900558 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5869a157-1ef6-4edd-a7c7-ca863b07a8ad" containerName="extract-content" Oct 09 
11:16:50 crc kubenswrapper[4740]: E1009 11:16:50.900609 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5869a157-1ef6-4edd-a7c7-ca863b07a8ad" containerName="extract-utilities" Oct 09 11:16:50 crc kubenswrapper[4740]: I1009 11:16:50.900623 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5869a157-1ef6-4edd-a7c7-ca863b07a8ad" containerName="extract-utilities" Oct 09 11:16:50 crc kubenswrapper[4740]: I1009 11:16:50.900925 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e60e0cc-224f-4208-af75-113f48786a87" containerName="collect-profiles" Oct 09 11:16:50 crc kubenswrapper[4740]: I1009 11:16:50.900958 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="5869a157-1ef6-4edd-a7c7-ca863b07a8ad" containerName="registry-server" Oct 09 11:16:50 crc kubenswrapper[4740]: I1009 11:16:50.907841 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvcdp" Oct 09 11:16:50 crc kubenswrapper[4740]: I1009 11:16:50.912979 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvcdp"] Oct 09 11:16:51 crc kubenswrapper[4740]: I1009 11:16:51.070617 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53976072-61a6-4516-85c7-69eaab1f122f-catalog-content\") pod \"redhat-operators-wvcdp\" (UID: \"53976072-61a6-4516-85c7-69eaab1f122f\") " pod="openshift-marketplace/redhat-operators-wvcdp" Oct 09 11:16:51 crc kubenswrapper[4740]: I1009 11:16:51.070685 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53976072-61a6-4516-85c7-69eaab1f122f-utilities\") pod \"redhat-operators-wvcdp\" (UID: \"53976072-61a6-4516-85c7-69eaab1f122f\") " pod="openshift-marketplace/redhat-operators-wvcdp" Oct 09 11:16:51 crc 
kubenswrapper[4740]: I1009 11:16:51.070718 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zmqw\" (UniqueName: \"kubernetes.io/projected/53976072-61a6-4516-85c7-69eaab1f122f-kube-api-access-7zmqw\") pod \"redhat-operators-wvcdp\" (UID: \"53976072-61a6-4516-85c7-69eaab1f122f\") " pod="openshift-marketplace/redhat-operators-wvcdp" Oct 09 11:16:51 crc kubenswrapper[4740]: I1009 11:16:51.173048 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53976072-61a6-4516-85c7-69eaab1f122f-catalog-content\") pod \"redhat-operators-wvcdp\" (UID: \"53976072-61a6-4516-85c7-69eaab1f122f\") " pod="openshift-marketplace/redhat-operators-wvcdp" Oct 09 11:16:51 crc kubenswrapper[4740]: I1009 11:16:51.173113 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53976072-61a6-4516-85c7-69eaab1f122f-utilities\") pod \"redhat-operators-wvcdp\" (UID: \"53976072-61a6-4516-85c7-69eaab1f122f\") " pod="openshift-marketplace/redhat-operators-wvcdp" Oct 09 11:16:51 crc kubenswrapper[4740]: I1009 11:16:51.173145 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zmqw\" (UniqueName: \"kubernetes.io/projected/53976072-61a6-4516-85c7-69eaab1f122f-kube-api-access-7zmqw\") pod \"redhat-operators-wvcdp\" (UID: \"53976072-61a6-4516-85c7-69eaab1f122f\") " pod="openshift-marketplace/redhat-operators-wvcdp" Oct 09 11:16:51 crc kubenswrapper[4740]: I1009 11:16:51.173657 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53976072-61a6-4516-85c7-69eaab1f122f-catalog-content\") pod \"redhat-operators-wvcdp\" (UID: \"53976072-61a6-4516-85c7-69eaab1f122f\") " pod="openshift-marketplace/redhat-operators-wvcdp" Oct 09 11:16:51 crc kubenswrapper[4740]: 
I1009 11:16:51.173712 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53976072-61a6-4516-85c7-69eaab1f122f-utilities\") pod \"redhat-operators-wvcdp\" (UID: \"53976072-61a6-4516-85c7-69eaab1f122f\") " pod="openshift-marketplace/redhat-operators-wvcdp" Oct 09 11:16:51 crc kubenswrapper[4740]: I1009 11:16:51.205045 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zmqw\" (UniqueName: \"kubernetes.io/projected/53976072-61a6-4516-85c7-69eaab1f122f-kube-api-access-7zmqw\") pod \"redhat-operators-wvcdp\" (UID: \"53976072-61a6-4516-85c7-69eaab1f122f\") " pod="openshift-marketplace/redhat-operators-wvcdp" Oct 09 11:16:51 crc kubenswrapper[4740]: I1009 11:16:51.246684 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvcdp" Oct 09 11:16:51 crc kubenswrapper[4740]: I1009 11:16:51.701975 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvcdp"] Oct 09 11:16:51 crc kubenswrapper[4740]: I1009 11:16:51.827450 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvcdp" event={"ID":"53976072-61a6-4516-85c7-69eaab1f122f","Type":"ContainerStarted","Data":"e28126dcc4fd8602477d9c16aa8af20e5d25de4bf7ccc5c040e431bd95f5db7a"} Oct 09 11:16:52 crc kubenswrapper[4740]: I1009 11:16:52.843231 4740 generic.go:334] "Generic (PLEG): container finished" podID="53976072-61a6-4516-85c7-69eaab1f122f" containerID="771541087e942c5b48950d12202462acb06b2ea8e9e1e4e7d4a0b132ab63d3d9" exitCode=0 Oct 09 11:16:52 crc kubenswrapper[4740]: I1009 11:16:52.843319 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvcdp" event={"ID":"53976072-61a6-4516-85c7-69eaab1f122f","Type":"ContainerDied","Data":"771541087e942c5b48950d12202462acb06b2ea8e9e1e4e7d4a0b132ab63d3d9"} Oct 09 11:16:52 crc 
kubenswrapper[4740]: I1009 11:16:52.847269 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 11:16:54 crc kubenswrapper[4740]: I1009 11:16:54.864560 4740 generic.go:334] "Generic (PLEG): container finished" podID="53976072-61a6-4516-85c7-69eaab1f122f" containerID="157c44e9dfe696b2845ed6d1e6689b9b47b357a29c64c5062261b80427d58c91" exitCode=0 Oct 09 11:16:54 crc kubenswrapper[4740]: I1009 11:16:54.864670 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvcdp" event={"ID":"53976072-61a6-4516-85c7-69eaab1f122f","Type":"ContainerDied","Data":"157c44e9dfe696b2845ed6d1e6689b9b47b357a29c64c5062261b80427d58c91"} Oct 09 11:16:56 crc kubenswrapper[4740]: I1009 11:16:56.903152 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvcdp" event={"ID":"53976072-61a6-4516-85c7-69eaab1f122f","Type":"ContainerStarted","Data":"0a5f1d244d4605432fcf772782abcdad1e8d7f4338c9bef556b4180c7bf5eb97"} Oct 09 11:16:56 crc kubenswrapper[4740]: I1009 11:16:56.933391 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wvcdp" podStartSLOduration=4.324505849 podStartE2EDuration="6.933373293s" podCreationTimestamp="2025-10-09 11:16:50 +0000 UTC" firstStartedPulling="2025-10-09 11:16:52.846713798 +0000 UTC m=+2951.808914219" lastFinishedPulling="2025-10-09 11:16:55.455581272 +0000 UTC m=+2954.417781663" observedRunningTime="2025-10-09 11:16:56.929462406 +0000 UTC m=+2955.891662797" watchObservedRunningTime="2025-10-09 11:16:56.933373293 +0000 UTC m=+2955.895573684" Oct 09 11:16:58 crc kubenswrapper[4740]: I1009 11:16:58.754985 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:16:58 crc kubenswrapper[4740]: E1009 11:16:58.755744 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:17:01 crc kubenswrapper[4740]: I1009 11:17:01.247445 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wvcdp" Oct 09 11:17:01 crc kubenswrapper[4740]: I1009 11:17:01.247877 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wvcdp" Oct 09 11:17:01 crc kubenswrapper[4740]: I1009 11:17:01.327882 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wvcdp" Oct 09 11:17:02 crc kubenswrapper[4740]: I1009 11:17:02.040737 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wvcdp" Oct 09 11:17:02 crc kubenswrapper[4740]: I1009 11:17:02.099566 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wvcdp"] Oct 09 11:17:03 crc kubenswrapper[4740]: I1009 11:17:03.977531 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wvcdp" podUID="53976072-61a6-4516-85c7-69eaab1f122f" containerName="registry-server" containerID="cri-o://0a5f1d244d4605432fcf772782abcdad1e8d7f4338c9bef556b4180c7bf5eb97" gracePeriod=2 Oct 09 11:17:04 crc kubenswrapper[4740]: I1009 11:17:04.491196 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wvcdp" Oct 09 11:17:04 crc kubenswrapper[4740]: I1009 11:17:04.557649 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zmqw\" (UniqueName: \"kubernetes.io/projected/53976072-61a6-4516-85c7-69eaab1f122f-kube-api-access-7zmqw\") pod \"53976072-61a6-4516-85c7-69eaab1f122f\" (UID: \"53976072-61a6-4516-85c7-69eaab1f122f\") " Oct 09 11:17:04 crc kubenswrapper[4740]: I1009 11:17:04.557863 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53976072-61a6-4516-85c7-69eaab1f122f-catalog-content\") pod \"53976072-61a6-4516-85c7-69eaab1f122f\" (UID: \"53976072-61a6-4516-85c7-69eaab1f122f\") " Oct 09 11:17:04 crc kubenswrapper[4740]: I1009 11:17:04.557931 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53976072-61a6-4516-85c7-69eaab1f122f-utilities\") pod \"53976072-61a6-4516-85c7-69eaab1f122f\" (UID: \"53976072-61a6-4516-85c7-69eaab1f122f\") " Oct 09 11:17:04 crc kubenswrapper[4740]: I1009 11:17:04.558828 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53976072-61a6-4516-85c7-69eaab1f122f-utilities" (OuterVolumeSpecName: "utilities") pod "53976072-61a6-4516-85c7-69eaab1f122f" (UID: "53976072-61a6-4516-85c7-69eaab1f122f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:17:04 crc kubenswrapper[4740]: I1009 11:17:04.565481 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53976072-61a6-4516-85c7-69eaab1f122f-kube-api-access-7zmqw" (OuterVolumeSpecName: "kube-api-access-7zmqw") pod "53976072-61a6-4516-85c7-69eaab1f122f" (UID: "53976072-61a6-4516-85c7-69eaab1f122f"). InnerVolumeSpecName "kube-api-access-7zmqw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:17:04 crc kubenswrapper[4740]: I1009 11:17:04.638039 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53976072-61a6-4516-85c7-69eaab1f122f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53976072-61a6-4516-85c7-69eaab1f122f" (UID: "53976072-61a6-4516-85c7-69eaab1f122f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:17:04 crc kubenswrapper[4740]: I1009 11:17:04.660492 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53976072-61a6-4516-85c7-69eaab1f122f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 11:17:04 crc kubenswrapper[4740]: I1009 11:17:04.660520 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53976072-61a6-4516-85c7-69eaab1f122f-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 11:17:04 crc kubenswrapper[4740]: I1009 11:17:04.660533 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zmqw\" (UniqueName: \"kubernetes.io/projected/53976072-61a6-4516-85c7-69eaab1f122f-kube-api-access-7zmqw\") on node \"crc\" DevicePath \"\"" Oct 09 11:17:04 crc kubenswrapper[4740]: I1009 11:17:04.987894 4740 generic.go:334] "Generic (PLEG): container finished" podID="53976072-61a6-4516-85c7-69eaab1f122f" containerID="0a5f1d244d4605432fcf772782abcdad1e8d7f4338c9bef556b4180c7bf5eb97" exitCode=0 Oct 09 11:17:04 crc kubenswrapper[4740]: I1009 11:17:04.987942 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvcdp" event={"ID":"53976072-61a6-4516-85c7-69eaab1f122f","Type":"ContainerDied","Data":"0a5f1d244d4605432fcf772782abcdad1e8d7f4338c9bef556b4180c7bf5eb97"} Oct 09 11:17:04 crc kubenswrapper[4740]: I1009 11:17:04.987974 4740 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-wvcdp" event={"ID":"53976072-61a6-4516-85c7-69eaab1f122f","Type":"ContainerDied","Data":"e28126dcc4fd8602477d9c16aa8af20e5d25de4bf7ccc5c040e431bd95f5db7a"} Oct 09 11:17:04 crc kubenswrapper[4740]: I1009 11:17:04.987991 4740 scope.go:117] "RemoveContainer" containerID="0a5f1d244d4605432fcf772782abcdad1e8d7f4338c9bef556b4180c7bf5eb97" Oct 09 11:17:04 crc kubenswrapper[4740]: I1009 11:17:04.988113 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvcdp" Oct 09 11:17:05 crc kubenswrapper[4740]: I1009 11:17:05.030427 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wvcdp"] Oct 09 11:17:05 crc kubenswrapper[4740]: I1009 11:17:05.033226 4740 scope.go:117] "RemoveContainer" containerID="157c44e9dfe696b2845ed6d1e6689b9b47b357a29c64c5062261b80427d58c91" Oct 09 11:17:05 crc kubenswrapper[4740]: I1009 11:17:05.038974 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wvcdp"] Oct 09 11:17:05 crc kubenswrapper[4740]: I1009 11:17:05.059728 4740 scope.go:117] "RemoveContainer" containerID="771541087e942c5b48950d12202462acb06b2ea8e9e1e4e7d4a0b132ab63d3d9" Oct 09 11:17:05 crc kubenswrapper[4740]: I1009 11:17:05.116140 4740 scope.go:117] "RemoveContainer" containerID="0a5f1d244d4605432fcf772782abcdad1e8d7f4338c9bef556b4180c7bf5eb97" Oct 09 11:17:05 crc kubenswrapper[4740]: E1009 11:17:05.116609 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a5f1d244d4605432fcf772782abcdad1e8d7f4338c9bef556b4180c7bf5eb97\": container with ID starting with 0a5f1d244d4605432fcf772782abcdad1e8d7f4338c9bef556b4180c7bf5eb97 not found: ID does not exist" containerID="0a5f1d244d4605432fcf772782abcdad1e8d7f4338c9bef556b4180c7bf5eb97" Oct 09 11:17:05 crc kubenswrapper[4740]: I1009 11:17:05.116636 4740 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a5f1d244d4605432fcf772782abcdad1e8d7f4338c9bef556b4180c7bf5eb97"} err="failed to get container status \"0a5f1d244d4605432fcf772782abcdad1e8d7f4338c9bef556b4180c7bf5eb97\": rpc error: code = NotFound desc = could not find container \"0a5f1d244d4605432fcf772782abcdad1e8d7f4338c9bef556b4180c7bf5eb97\": container with ID starting with 0a5f1d244d4605432fcf772782abcdad1e8d7f4338c9bef556b4180c7bf5eb97 not found: ID does not exist" Oct 09 11:17:05 crc kubenswrapper[4740]: I1009 11:17:05.116656 4740 scope.go:117] "RemoveContainer" containerID="157c44e9dfe696b2845ed6d1e6689b9b47b357a29c64c5062261b80427d58c91" Oct 09 11:17:05 crc kubenswrapper[4740]: E1009 11:17:05.117007 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"157c44e9dfe696b2845ed6d1e6689b9b47b357a29c64c5062261b80427d58c91\": container with ID starting with 157c44e9dfe696b2845ed6d1e6689b9b47b357a29c64c5062261b80427d58c91 not found: ID does not exist" containerID="157c44e9dfe696b2845ed6d1e6689b9b47b357a29c64c5062261b80427d58c91" Oct 09 11:17:05 crc kubenswrapper[4740]: I1009 11:17:05.117124 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"157c44e9dfe696b2845ed6d1e6689b9b47b357a29c64c5062261b80427d58c91"} err="failed to get container status \"157c44e9dfe696b2845ed6d1e6689b9b47b357a29c64c5062261b80427d58c91\": rpc error: code = NotFound desc = could not find container \"157c44e9dfe696b2845ed6d1e6689b9b47b357a29c64c5062261b80427d58c91\": container with ID starting with 157c44e9dfe696b2845ed6d1e6689b9b47b357a29c64c5062261b80427d58c91 not found: ID does not exist" Oct 09 11:17:05 crc kubenswrapper[4740]: I1009 11:17:05.117219 4740 scope.go:117] "RemoveContainer" containerID="771541087e942c5b48950d12202462acb06b2ea8e9e1e4e7d4a0b132ab63d3d9" Oct 09 11:17:05 crc kubenswrapper[4740]: E1009 
11:17:05.117588 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"771541087e942c5b48950d12202462acb06b2ea8e9e1e4e7d4a0b132ab63d3d9\": container with ID starting with 771541087e942c5b48950d12202462acb06b2ea8e9e1e4e7d4a0b132ab63d3d9 not found: ID does not exist" containerID="771541087e942c5b48950d12202462acb06b2ea8e9e1e4e7d4a0b132ab63d3d9" Oct 09 11:17:05 crc kubenswrapper[4740]: I1009 11:17:05.117730 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"771541087e942c5b48950d12202462acb06b2ea8e9e1e4e7d4a0b132ab63d3d9"} err="failed to get container status \"771541087e942c5b48950d12202462acb06b2ea8e9e1e4e7d4a0b132ab63d3d9\": rpc error: code = NotFound desc = could not find container \"771541087e942c5b48950d12202462acb06b2ea8e9e1e4e7d4a0b132ab63d3d9\": container with ID starting with 771541087e942c5b48950d12202462acb06b2ea8e9e1e4e7d4a0b132ab63d3d9 not found: ID does not exist" Oct 09 11:17:05 crc kubenswrapper[4740]: E1009 11:17:05.224556 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53976072_61a6_4516_85c7_69eaab1f122f.slice/crio-e28126dcc4fd8602477d9c16aa8af20e5d25de4bf7ccc5c040e431bd95f5db7a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53976072_61a6_4516_85c7_69eaab1f122f.slice\": RecentStats: unable to find data in memory cache]" Oct 09 11:17:05 crc kubenswrapper[4740]: I1009 11:17:05.764826 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53976072-61a6-4516-85c7-69eaab1f122f" path="/var/lib/kubelet/pods/53976072-61a6-4516-85c7-69eaab1f122f/volumes" Oct 09 11:17:10 crc kubenswrapper[4740]: I1009 11:17:10.754083 4740 scope.go:117] "RemoveContainer" 
containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:17:10 crc kubenswrapper[4740]: E1009 11:17:10.755147 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:17:23 crc kubenswrapper[4740]: I1009 11:17:23.754447 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:17:23 crc kubenswrapper[4740]: E1009 11:17:23.755476 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:17:35 crc kubenswrapper[4740]: I1009 11:17:35.753564 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:17:35 crc kubenswrapper[4740]: E1009 11:17:35.754294 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:17:48 crc kubenswrapper[4740]: I1009 11:17:48.753301 4740 scope.go:117] 
"RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:17:48 crc kubenswrapper[4740]: E1009 11:17:48.754806 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:18:01 crc kubenswrapper[4740]: I1009 11:18:01.759788 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:18:01 crc kubenswrapper[4740]: E1009 11:18:01.760581 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:18:12 crc kubenswrapper[4740]: I1009 11:18:12.753824 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:18:12 crc kubenswrapper[4740]: E1009 11:18:12.755493 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:18:25 crc kubenswrapper[4740]: I1009 11:18:25.753803 
4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:18:25 crc kubenswrapper[4740]: E1009 11:18:25.755060 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:18:37 crc kubenswrapper[4740]: I1009 11:18:37.754507 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:18:37 crc kubenswrapper[4740]: E1009 11:18:37.755979 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:18:49 crc kubenswrapper[4740]: I1009 11:18:49.754208 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:18:49 crc kubenswrapper[4740]: E1009 11:18:49.755134 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:19:02 crc kubenswrapper[4740]: I1009 
11:19:02.754742 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:19:02 crc kubenswrapper[4740]: E1009 11:19:02.756020 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:19:13 crc kubenswrapper[4740]: I1009 11:19:13.754038 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:19:13 crc kubenswrapper[4740]: E1009 11:19:13.755080 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:19:24 crc kubenswrapper[4740]: I1009 11:19:24.754118 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:19:24 crc kubenswrapper[4740]: E1009 11:19:24.754944 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:19:36 crc 
kubenswrapper[4740]: I1009 11:19:36.753861 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:19:36 crc kubenswrapper[4740]: E1009 11:19:36.754821 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:19:51 crc kubenswrapper[4740]: I1009 11:19:51.777549 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:19:51 crc kubenswrapper[4740]: E1009 11:19:51.778713 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:20:06 crc kubenswrapper[4740]: I1009 11:20:06.754562 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:20:06 crc kubenswrapper[4740]: E1009 11:20:06.756080 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 
09 11:20:17 crc kubenswrapper[4740]: I1009 11:20:17.753814 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:20:17 crc kubenswrapper[4740]: E1009 11:20:17.754949 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:20:32 crc kubenswrapper[4740]: I1009 11:20:32.754334 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:20:32 crc kubenswrapper[4740]: E1009 11:20:32.755977 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:20:43 crc kubenswrapper[4740]: I1009 11:20:43.754220 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:20:43 crc kubenswrapper[4740]: E1009 11:20:43.755025 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" 
podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:20:58 crc kubenswrapper[4740]: I1009 11:20:58.754450 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:20:58 crc kubenswrapper[4740]: E1009 11:20:58.756126 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:21:13 crc kubenswrapper[4740]: I1009 11:21:13.754047 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:21:14 crc kubenswrapper[4740]: I1009 11:21:14.502741 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerStarted","Data":"51918591255fd9bc84070769a7fb279ef80f14e662d5784322c40cfbec726c46"} Oct 09 11:23:04 crc kubenswrapper[4740]: I1009 11:23:04.305015 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7zcm4"] Oct 09 11:23:04 crc kubenswrapper[4740]: E1009 11:23:04.306033 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53976072-61a6-4516-85c7-69eaab1f122f" containerName="extract-content" Oct 09 11:23:04 crc kubenswrapper[4740]: I1009 11:23:04.306048 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="53976072-61a6-4516-85c7-69eaab1f122f" containerName="extract-content" Oct 09 11:23:04 crc kubenswrapper[4740]: E1009 11:23:04.306065 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53976072-61a6-4516-85c7-69eaab1f122f" 
containerName="extract-utilities" Oct 09 11:23:04 crc kubenswrapper[4740]: I1009 11:23:04.306072 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="53976072-61a6-4516-85c7-69eaab1f122f" containerName="extract-utilities" Oct 09 11:23:04 crc kubenswrapper[4740]: E1009 11:23:04.306095 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53976072-61a6-4516-85c7-69eaab1f122f" containerName="registry-server" Oct 09 11:23:04 crc kubenswrapper[4740]: I1009 11:23:04.306102 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="53976072-61a6-4516-85c7-69eaab1f122f" containerName="registry-server" Oct 09 11:23:04 crc kubenswrapper[4740]: I1009 11:23:04.306296 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="53976072-61a6-4516-85c7-69eaab1f122f" containerName="registry-server" Oct 09 11:23:04 crc kubenswrapper[4740]: I1009 11:23:04.307998 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zcm4" Oct 09 11:23:04 crc kubenswrapper[4740]: I1009 11:23:04.321078 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zcm4"] Oct 09 11:23:04 crc kubenswrapper[4740]: I1009 11:23:04.338340 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3ae7a40-4b21-490c-9964-14ccce5d0df9-utilities\") pod \"certified-operators-7zcm4\" (UID: \"e3ae7a40-4b21-490c-9964-14ccce5d0df9\") " pod="openshift-marketplace/certified-operators-7zcm4" Oct 09 11:23:04 crc kubenswrapper[4740]: I1009 11:23:04.338397 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pzfq\" (UniqueName: \"kubernetes.io/projected/e3ae7a40-4b21-490c-9964-14ccce5d0df9-kube-api-access-7pzfq\") pod \"certified-operators-7zcm4\" (UID: \"e3ae7a40-4b21-490c-9964-14ccce5d0df9\") " 
pod="openshift-marketplace/certified-operators-7zcm4" Oct 09 11:23:04 crc kubenswrapper[4740]: I1009 11:23:04.338518 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3ae7a40-4b21-490c-9964-14ccce5d0df9-catalog-content\") pod \"certified-operators-7zcm4\" (UID: \"e3ae7a40-4b21-490c-9964-14ccce5d0df9\") " pod="openshift-marketplace/certified-operators-7zcm4" Oct 09 11:23:04 crc kubenswrapper[4740]: I1009 11:23:04.439947 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3ae7a40-4b21-490c-9964-14ccce5d0df9-catalog-content\") pod \"certified-operators-7zcm4\" (UID: \"e3ae7a40-4b21-490c-9964-14ccce5d0df9\") " pod="openshift-marketplace/certified-operators-7zcm4" Oct 09 11:23:04 crc kubenswrapper[4740]: I1009 11:23:04.440181 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3ae7a40-4b21-490c-9964-14ccce5d0df9-utilities\") pod \"certified-operators-7zcm4\" (UID: \"e3ae7a40-4b21-490c-9964-14ccce5d0df9\") " pod="openshift-marketplace/certified-operators-7zcm4" Oct 09 11:23:04 crc kubenswrapper[4740]: I1009 11:23:04.440206 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pzfq\" (UniqueName: \"kubernetes.io/projected/e3ae7a40-4b21-490c-9964-14ccce5d0df9-kube-api-access-7pzfq\") pod \"certified-operators-7zcm4\" (UID: \"e3ae7a40-4b21-490c-9964-14ccce5d0df9\") " pod="openshift-marketplace/certified-operators-7zcm4" Oct 09 11:23:04 crc kubenswrapper[4740]: I1009 11:23:04.440533 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3ae7a40-4b21-490c-9964-14ccce5d0df9-catalog-content\") pod \"certified-operators-7zcm4\" (UID: \"e3ae7a40-4b21-490c-9964-14ccce5d0df9\") " 
pod="openshift-marketplace/certified-operators-7zcm4" Oct 09 11:23:04 crc kubenswrapper[4740]: I1009 11:23:04.440598 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3ae7a40-4b21-490c-9964-14ccce5d0df9-utilities\") pod \"certified-operators-7zcm4\" (UID: \"e3ae7a40-4b21-490c-9964-14ccce5d0df9\") " pod="openshift-marketplace/certified-operators-7zcm4" Oct 09 11:23:04 crc kubenswrapper[4740]: I1009 11:23:04.459548 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pzfq\" (UniqueName: \"kubernetes.io/projected/e3ae7a40-4b21-490c-9964-14ccce5d0df9-kube-api-access-7pzfq\") pod \"certified-operators-7zcm4\" (UID: \"e3ae7a40-4b21-490c-9964-14ccce5d0df9\") " pod="openshift-marketplace/certified-operators-7zcm4" Oct 09 11:23:04 crc kubenswrapper[4740]: I1009 11:23:04.638624 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zcm4" Oct 09 11:23:05 crc kubenswrapper[4740]: I1009 11:23:05.137946 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zcm4"] Oct 09 11:23:05 crc kubenswrapper[4740]: I1009 11:23:05.647890 4740 generic.go:334] "Generic (PLEG): container finished" podID="e3ae7a40-4b21-490c-9964-14ccce5d0df9" containerID="fa2c1daac3754717cc85e1324ebea473581dfd7297039fef0c7cca4b220ce95a" exitCode=0 Oct 09 11:23:05 crc kubenswrapper[4740]: I1009 11:23:05.647995 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zcm4" event={"ID":"e3ae7a40-4b21-490c-9964-14ccce5d0df9","Type":"ContainerDied","Data":"fa2c1daac3754717cc85e1324ebea473581dfd7297039fef0c7cca4b220ce95a"} Oct 09 11:23:05 crc kubenswrapper[4740]: I1009 11:23:05.648125 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zcm4" 
event={"ID":"e3ae7a40-4b21-490c-9964-14ccce5d0df9","Type":"ContainerStarted","Data":"0e02b48000ecc63c491af53ea3ddac864b2990fb7bd158f85c94e88bdb276ef4"} Oct 09 11:23:05 crc kubenswrapper[4740]: I1009 11:23:05.651110 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 11:23:07 crc kubenswrapper[4740]: I1009 11:23:07.500847 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6bpdl"] Oct 09 11:23:07 crc kubenswrapper[4740]: I1009 11:23:07.505712 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6bpdl" Oct 09 11:23:07 crc kubenswrapper[4740]: I1009 11:23:07.518747 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6bpdl"] Oct 09 11:23:07 crc kubenswrapper[4740]: I1009 11:23:07.615366 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vff8p\" (UniqueName: \"kubernetes.io/projected/52285861-a6f6-40ae-a79d-eb6e0891dc03-kube-api-access-vff8p\") pod \"community-operators-6bpdl\" (UID: \"52285861-a6f6-40ae-a79d-eb6e0891dc03\") " pod="openshift-marketplace/community-operators-6bpdl" Oct 09 11:23:07 crc kubenswrapper[4740]: I1009 11:23:07.615942 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52285861-a6f6-40ae-a79d-eb6e0891dc03-utilities\") pod \"community-operators-6bpdl\" (UID: \"52285861-a6f6-40ae-a79d-eb6e0891dc03\") " pod="openshift-marketplace/community-operators-6bpdl" Oct 09 11:23:07 crc kubenswrapper[4740]: I1009 11:23:07.616044 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52285861-a6f6-40ae-a79d-eb6e0891dc03-catalog-content\") pod \"community-operators-6bpdl\" 
(UID: \"52285861-a6f6-40ae-a79d-eb6e0891dc03\") " pod="openshift-marketplace/community-operators-6bpdl" Oct 09 11:23:07 crc kubenswrapper[4740]: I1009 11:23:07.669274 4740 generic.go:334] "Generic (PLEG): container finished" podID="e3ae7a40-4b21-490c-9964-14ccce5d0df9" containerID="4c1abd924b32308d3b6fc261eb4899a0184bfb7b6ee7c047f0c07da3b2a335b4" exitCode=0 Oct 09 11:23:07 crc kubenswrapper[4740]: I1009 11:23:07.669335 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zcm4" event={"ID":"e3ae7a40-4b21-490c-9964-14ccce5d0df9","Type":"ContainerDied","Data":"4c1abd924b32308d3b6fc261eb4899a0184bfb7b6ee7c047f0c07da3b2a335b4"} Oct 09 11:23:07 crc kubenswrapper[4740]: I1009 11:23:07.718414 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52285861-a6f6-40ae-a79d-eb6e0891dc03-utilities\") pod \"community-operators-6bpdl\" (UID: \"52285861-a6f6-40ae-a79d-eb6e0891dc03\") " pod="openshift-marketplace/community-operators-6bpdl" Oct 09 11:23:07 crc kubenswrapper[4740]: I1009 11:23:07.718549 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52285861-a6f6-40ae-a79d-eb6e0891dc03-catalog-content\") pod \"community-operators-6bpdl\" (UID: \"52285861-a6f6-40ae-a79d-eb6e0891dc03\") " pod="openshift-marketplace/community-operators-6bpdl" Oct 09 11:23:07 crc kubenswrapper[4740]: I1009 11:23:07.718695 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vff8p\" (UniqueName: \"kubernetes.io/projected/52285861-a6f6-40ae-a79d-eb6e0891dc03-kube-api-access-vff8p\") pod \"community-operators-6bpdl\" (UID: \"52285861-a6f6-40ae-a79d-eb6e0891dc03\") " pod="openshift-marketplace/community-operators-6bpdl" Oct 09 11:23:07 crc kubenswrapper[4740]: I1009 11:23:07.719125 4740 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52285861-a6f6-40ae-a79d-eb6e0891dc03-catalog-content\") pod \"community-operators-6bpdl\" (UID: \"52285861-a6f6-40ae-a79d-eb6e0891dc03\") " pod="openshift-marketplace/community-operators-6bpdl" Oct 09 11:23:07 crc kubenswrapper[4740]: I1009 11:23:07.719125 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52285861-a6f6-40ae-a79d-eb6e0891dc03-utilities\") pod \"community-operators-6bpdl\" (UID: \"52285861-a6f6-40ae-a79d-eb6e0891dc03\") " pod="openshift-marketplace/community-operators-6bpdl" Oct 09 11:23:07 crc kubenswrapper[4740]: I1009 11:23:07.740629 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vff8p\" (UniqueName: \"kubernetes.io/projected/52285861-a6f6-40ae-a79d-eb6e0891dc03-kube-api-access-vff8p\") pod \"community-operators-6bpdl\" (UID: \"52285861-a6f6-40ae-a79d-eb6e0891dc03\") " pod="openshift-marketplace/community-operators-6bpdl" Oct 09 11:23:07 crc kubenswrapper[4740]: I1009 11:23:07.830144 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6bpdl" Oct 09 11:23:08 crc kubenswrapper[4740]: I1009 11:23:08.352396 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6bpdl"] Oct 09 11:23:08 crc kubenswrapper[4740]: W1009 11:23:08.361611 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52285861_a6f6_40ae_a79d_eb6e0891dc03.slice/crio-36a4c3782f8eeca0afc124028f7e7b8828a6167fe525d5f9dbeb70815108a227 WatchSource:0}: Error finding container 36a4c3782f8eeca0afc124028f7e7b8828a6167fe525d5f9dbeb70815108a227: Status 404 returned error can't find the container with id 36a4c3782f8eeca0afc124028f7e7b8828a6167fe525d5f9dbeb70815108a227 Oct 09 11:23:08 crc kubenswrapper[4740]: I1009 11:23:08.680412 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bpdl" event={"ID":"52285861-a6f6-40ae-a79d-eb6e0891dc03","Type":"ContainerStarted","Data":"bb5e246df83822351193cfd8550052699b7ebc3a316592fe919c69bdcd4a4c06"} Oct 09 11:23:08 crc kubenswrapper[4740]: I1009 11:23:08.680816 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bpdl" event={"ID":"52285861-a6f6-40ae-a79d-eb6e0891dc03","Type":"ContainerStarted","Data":"36a4c3782f8eeca0afc124028f7e7b8828a6167fe525d5f9dbeb70815108a227"} Oct 09 11:23:09 crc kubenswrapper[4740]: I1009 11:23:09.691833 4740 generic.go:334] "Generic (PLEG): container finished" podID="52285861-a6f6-40ae-a79d-eb6e0891dc03" containerID="bb5e246df83822351193cfd8550052699b7ebc3a316592fe919c69bdcd4a4c06" exitCode=0 Oct 09 11:23:09 crc kubenswrapper[4740]: I1009 11:23:09.691910 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bpdl" 
event={"ID":"52285861-a6f6-40ae-a79d-eb6e0891dc03","Type":"ContainerDied","Data":"bb5e246df83822351193cfd8550052699b7ebc3a316592fe919c69bdcd4a4c06"} Oct 09 11:23:09 crc kubenswrapper[4740]: I1009 11:23:09.696290 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zcm4" event={"ID":"e3ae7a40-4b21-490c-9964-14ccce5d0df9","Type":"ContainerStarted","Data":"606e1bdebc41a25e7e187036c4b889dabedc8f3fc38c7e842628d1c7f991b4e6"} Oct 09 11:23:09 crc kubenswrapper[4740]: I1009 11:23:09.756657 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7zcm4" podStartSLOduration=2.9119406 podStartE2EDuration="5.75664027s" podCreationTimestamp="2025-10-09 11:23:04 +0000 UTC" firstStartedPulling="2025-10-09 11:23:05.650868928 +0000 UTC m=+3324.613069299" lastFinishedPulling="2025-10-09 11:23:08.495568578 +0000 UTC m=+3327.457768969" observedRunningTime="2025-10-09 11:23:09.749161279 +0000 UTC m=+3328.711361700" watchObservedRunningTime="2025-10-09 11:23:09.75664027 +0000 UTC m=+3328.718840661" Oct 09 11:23:11 crc kubenswrapper[4740]: I1009 11:23:11.717793 4740 generic.go:334] "Generic (PLEG): container finished" podID="52285861-a6f6-40ae-a79d-eb6e0891dc03" containerID="2ea0349f0c1768999c9a908ca5adb542e965b9a29dac48828b30acedc3d8b1ef" exitCode=0 Oct 09 11:23:11 crc kubenswrapper[4740]: I1009 11:23:11.717905 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bpdl" event={"ID":"52285861-a6f6-40ae-a79d-eb6e0891dc03","Type":"ContainerDied","Data":"2ea0349f0c1768999c9a908ca5adb542e965b9a29dac48828b30acedc3d8b1ef"} Oct 09 11:23:13 crc kubenswrapper[4740]: I1009 11:23:13.743241 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bpdl" 
event={"ID":"52285861-a6f6-40ae-a79d-eb6e0891dc03","Type":"ContainerStarted","Data":"f138a6c72093cb425aa04e1f3ce97e5ed2f5a61b4a54bb25ca6bc4d1deb4a325"} Oct 09 11:23:13 crc kubenswrapper[4740]: I1009 11:23:13.772812 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6bpdl" podStartSLOduration=3.9535159589999997 podStartE2EDuration="6.772787718s" podCreationTimestamp="2025-10-09 11:23:07 +0000 UTC" firstStartedPulling="2025-10-09 11:23:09.695049307 +0000 UTC m=+3328.657249718" lastFinishedPulling="2025-10-09 11:23:12.514321096 +0000 UTC m=+3331.476521477" observedRunningTime="2025-10-09 11:23:13.762598565 +0000 UTC m=+3332.724798996" watchObservedRunningTime="2025-10-09 11:23:13.772787718 +0000 UTC m=+3332.734988099" Oct 09 11:23:14 crc kubenswrapper[4740]: I1009 11:23:14.639660 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7zcm4" Oct 09 11:23:14 crc kubenswrapper[4740]: I1009 11:23:14.639994 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7zcm4" Oct 09 11:23:14 crc kubenswrapper[4740]: I1009 11:23:14.717278 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7zcm4" Oct 09 11:23:14 crc kubenswrapper[4740]: I1009 11:23:14.805215 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7zcm4" Oct 09 11:23:16 crc kubenswrapper[4740]: I1009 11:23:16.089385 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zcm4"] Oct 09 11:23:16 crc kubenswrapper[4740]: I1009 11:23:16.771437 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7zcm4" podUID="e3ae7a40-4b21-490c-9964-14ccce5d0df9" containerName="registry-server" 
containerID="cri-o://606e1bdebc41a25e7e187036c4b889dabedc8f3fc38c7e842628d1c7f991b4e6" gracePeriod=2 Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.258332 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zcm4" Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.322315 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3ae7a40-4b21-490c-9964-14ccce5d0df9-utilities\") pod \"e3ae7a40-4b21-490c-9964-14ccce5d0df9\" (UID: \"e3ae7a40-4b21-490c-9964-14ccce5d0df9\") " Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.322384 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pzfq\" (UniqueName: \"kubernetes.io/projected/e3ae7a40-4b21-490c-9964-14ccce5d0df9-kube-api-access-7pzfq\") pod \"e3ae7a40-4b21-490c-9964-14ccce5d0df9\" (UID: \"e3ae7a40-4b21-490c-9964-14ccce5d0df9\") " Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.322487 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3ae7a40-4b21-490c-9964-14ccce5d0df9-catalog-content\") pod \"e3ae7a40-4b21-490c-9964-14ccce5d0df9\" (UID: \"e3ae7a40-4b21-490c-9964-14ccce5d0df9\") " Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.323369 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3ae7a40-4b21-490c-9964-14ccce5d0df9-utilities" (OuterVolumeSpecName: "utilities") pod "e3ae7a40-4b21-490c-9964-14ccce5d0df9" (UID: "e3ae7a40-4b21-490c-9964-14ccce5d0df9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.329649 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3ae7a40-4b21-490c-9964-14ccce5d0df9-kube-api-access-7pzfq" (OuterVolumeSpecName: "kube-api-access-7pzfq") pod "e3ae7a40-4b21-490c-9964-14ccce5d0df9" (UID: "e3ae7a40-4b21-490c-9964-14ccce5d0df9"). InnerVolumeSpecName "kube-api-access-7pzfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.370288 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3ae7a40-4b21-490c-9964-14ccce5d0df9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3ae7a40-4b21-490c-9964-14ccce5d0df9" (UID: "e3ae7a40-4b21-490c-9964-14ccce5d0df9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.424212 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3ae7a40-4b21-490c-9964-14ccce5d0df9-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.424251 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pzfq\" (UniqueName: \"kubernetes.io/projected/e3ae7a40-4b21-490c-9964-14ccce5d0df9-kube-api-access-7pzfq\") on node \"crc\" DevicePath \"\"" Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.424264 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3ae7a40-4b21-490c-9964-14ccce5d0df9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.782828 4740 generic.go:334] "Generic (PLEG): container finished" podID="e3ae7a40-4b21-490c-9964-14ccce5d0df9" 
containerID="606e1bdebc41a25e7e187036c4b889dabedc8f3fc38c7e842628d1c7f991b4e6" exitCode=0 Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.782878 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zcm4" event={"ID":"e3ae7a40-4b21-490c-9964-14ccce5d0df9","Type":"ContainerDied","Data":"606e1bdebc41a25e7e187036c4b889dabedc8f3fc38c7e842628d1c7f991b4e6"} Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.782908 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zcm4" event={"ID":"e3ae7a40-4b21-490c-9964-14ccce5d0df9","Type":"ContainerDied","Data":"0e02b48000ecc63c491af53ea3ddac864b2990fb7bd158f85c94e88bdb276ef4"} Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.782928 4740 scope.go:117] "RemoveContainer" containerID="606e1bdebc41a25e7e187036c4b889dabedc8f3fc38c7e842628d1c7f991b4e6" Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.783063 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7zcm4" Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.811141 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zcm4"] Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.812425 4740 scope.go:117] "RemoveContainer" containerID="4c1abd924b32308d3b6fc261eb4899a0184bfb7b6ee7c047f0c07da3b2a335b4" Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.822499 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7zcm4"] Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.830723 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6bpdl" Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.830801 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6bpdl" Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.846931 4740 scope.go:117] "RemoveContainer" containerID="fa2c1daac3754717cc85e1324ebea473581dfd7297039fef0c7cca4b220ce95a" Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.881387 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6bpdl" Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.894723 4740 scope.go:117] "RemoveContainer" containerID="606e1bdebc41a25e7e187036c4b889dabedc8f3fc38c7e842628d1c7f991b4e6" Oct 09 11:23:17 crc kubenswrapper[4740]: E1009 11:23:17.896594 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"606e1bdebc41a25e7e187036c4b889dabedc8f3fc38c7e842628d1c7f991b4e6\": container with ID starting with 606e1bdebc41a25e7e187036c4b889dabedc8f3fc38c7e842628d1c7f991b4e6 not found: ID does not exist" containerID="606e1bdebc41a25e7e187036c4b889dabedc8f3fc38c7e842628d1c7f991b4e6" 
Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.896642 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606e1bdebc41a25e7e187036c4b889dabedc8f3fc38c7e842628d1c7f991b4e6"} err="failed to get container status \"606e1bdebc41a25e7e187036c4b889dabedc8f3fc38c7e842628d1c7f991b4e6\": rpc error: code = NotFound desc = could not find container \"606e1bdebc41a25e7e187036c4b889dabedc8f3fc38c7e842628d1c7f991b4e6\": container with ID starting with 606e1bdebc41a25e7e187036c4b889dabedc8f3fc38c7e842628d1c7f991b4e6 not found: ID does not exist" Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.896668 4740 scope.go:117] "RemoveContainer" containerID="4c1abd924b32308d3b6fc261eb4899a0184bfb7b6ee7c047f0c07da3b2a335b4" Oct 09 11:23:17 crc kubenswrapper[4740]: E1009 11:23:17.903107 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c1abd924b32308d3b6fc261eb4899a0184bfb7b6ee7c047f0c07da3b2a335b4\": container with ID starting with 4c1abd924b32308d3b6fc261eb4899a0184bfb7b6ee7c047f0c07da3b2a335b4 not found: ID does not exist" containerID="4c1abd924b32308d3b6fc261eb4899a0184bfb7b6ee7c047f0c07da3b2a335b4" Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.903169 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c1abd924b32308d3b6fc261eb4899a0184bfb7b6ee7c047f0c07da3b2a335b4"} err="failed to get container status \"4c1abd924b32308d3b6fc261eb4899a0184bfb7b6ee7c047f0c07da3b2a335b4\": rpc error: code = NotFound desc = could not find container \"4c1abd924b32308d3b6fc261eb4899a0184bfb7b6ee7c047f0c07da3b2a335b4\": container with ID starting with 4c1abd924b32308d3b6fc261eb4899a0184bfb7b6ee7c047f0c07da3b2a335b4 not found: ID does not exist" Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.903202 4740 scope.go:117] "RemoveContainer" 
containerID="fa2c1daac3754717cc85e1324ebea473581dfd7297039fef0c7cca4b220ce95a" Oct 09 11:23:17 crc kubenswrapper[4740]: E1009 11:23:17.903984 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa2c1daac3754717cc85e1324ebea473581dfd7297039fef0c7cca4b220ce95a\": container with ID starting with fa2c1daac3754717cc85e1324ebea473581dfd7297039fef0c7cca4b220ce95a not found: ID does not exist" containerID="fa2c1daac3754717cc85e1324ebea473581dfd7297039fef0c7cca4b220ce95a" Oct 09 11:23:17 crc kubenswrapper[4740]: I1009 11:23:17.904039 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa2c1daac3754717cc85e1324ebea473581dfd7297039fef0c7cca4b220ce95a"} err="failed to get container status \"fa2c1daac3754717cc85e1324ebea473581dfd7297039fef0c7cca4b220ce95a\": rpc error: code = NotFound desc = could not find container \"fa2c1daac3754717cc85e1324ebea473581dfd7297039fef0c7cca4b220ce95a\": container with ID starting with fa2c1daac3754717cc85e1324ebea473581dfd7297039fef0c7cca4b220ce95a not found: ID does not exist" Oct 09 11:23:18 crc kubenswrapper[4740]: I1009 11:23:18.880480 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6bpdl" Oct 09 11:23:19 crc kubenswrapper[4740]: I1009 11:23:19.765997 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3ae7a40-4b21-490c-9964-14ccce5d0df9" path="/var/lib/kubelet/pods/e3ae7a40-4b21-490c-9964-14ccce5d0df9/volumes" Oct 09 11:23:20 crc kubenswrapper[4740]: I1009 11:23:20.295743 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6bpdl"] Oct 09 11:23:20 crc kubenswrapper[4740]: I1009 11:23:20.818407 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6bpdl" podUID="52285861-a6f6-40ae-a79d-eb6e0891dc03" 
containerName="registry-server" containerID="cri-o://f138a6c72093cb425aa04e1f3ce97e5ed2f5a61b4a54bb25ca6bc4d1deb4a325" gracePeriod=2 Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.344874 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6bpdl" Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.405432 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52285861-a6f6-40ae-a79d-eb6e0891dc03-catalog-content\") pod \"52285861-a6f6-40ae-a79d-eb6e0891dc03\" (UID: \"52285861-a6f6-40ae-a79d-eb6e0891dc03\") " Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.405517 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vff8p\" (UniqueName: \"kubernetes.io/projected/52285861-a6f6-40ae-a79d-eb6e0891dc03-kube-api-access-vff8p\") pod \"52285861-a6f6-40ae-a79d-eb6e0891dc03\" (UID: \"52285861-a6f6-40ae-a79d-eb6e0891dc03\") " Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.405582 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52285861-a6f6-40ae-a79d-eb6e0891dc03-utilities\") pod \"52285861-a6f6-40ae-a79d-eb6e0891dc03\" (UID: \"52285861-a6f6-40ae-a79d-eb6e0891dc03\") " Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.406979 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52285861-a6f6-40ae-a79d-eb6e0891dc03-utilities" (OuterVolumeSpecName: "utilities") pod "52285861-a6f6-40ae-a79d-eb6e0891dc03" (UID: "52285861-a6f6-40ae-a79d-eb6e0891dc03"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.412626 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52285861-a6f6-40ae-a79d-eb6e0891dc03-kube-api-access-vff8p" (OuterVolumeSpecName: "kube-api-access-vff8p") pod "52285861-a6f6-40ae-a79d-eb6e0891dc03" (UID: "52285861-a6f6-40ae-a79d-eb6e0891dc03"). InnerVolumeSpecName "kube-api-access-vff8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.452881 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52285861-a6f6-40ae-a79d-eb6e0891dc03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52285861-a6f6-40ae-a79d-eb6e0891dc03" (UID: "52285861-a6f6-40ae-a79d-eb6e0891dc03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.507791 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52285861-a6f6-40ae-a79d-eb6e0891dc03-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.507821 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vff8p\" (UniqueName: \"kubernetes.io/projected/52285861-a6f6-40ae-a79d-eb6e0891dc03-kube-api-access-vff8p\") on node \"crc\" DevicePath \"\"" Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.507832 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52285861-a6f6-40ae-a79d-eb6e0891dc03-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.829128 4740 generic.go:334] "Generic (PLEG): container finished" podID="52285861-a6f6-40ae-a79d-eb6e0891dc03" 
containerID="f138a6c72093cb425aa04e1f3ce97e5ed2f5a61b4a54bb25ca6bc4d1deb4a325" exitCode=0 Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.829197 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6bpdl" Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.829232 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bpdl" event={"ID":"52285861-a6f6-40ae-a79d-eb6e0891dc03","Type":"ContainerDied","Data":"f138a6c72093cb425aa04e1f3ce97e5ed2f5a61b4a54bb25ca6bc4d1deb4a325"} Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.829663 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bpdl" event={"ID":"52285861-a6f6-40ae-a79d-eb6e0891dc03","Type":"ContainerDied","Data":"36a4c3782f8eeca0afc124028f7e7b8828a6167fe525d5f9dbeb70815108a227"} Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.829692 4740 scope.go:117] "RemoveContainer" containerID="f138a6c72093cb425aa04e1f3ce97e5ed2f5a61b4a54bb25ca6bc4d1deb4a325" Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.858878 4740 scope.go:117] "RemoveContainer" containerID="2ea0349f0c1768999c9a908ca5adb542e965b9a29dac48828b30acedc3d8b1ef" Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.860582 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6bpdl"] Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.869304 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6bpdl"] Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.886977 4740 scope.go:117] "RemoveContainer" containerID="bb5e246df83822351193cfd8550052699b7ebc3a316592fe919c69bdcd4a4c06" Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.929064 4740 scope.go:117] "RemoveContainer" containerID="f138a6c72093cb425aa04e1f3ce97e5ed2f5a61b4a54bb25ca6bc4d1deb4a325" Oct 09 
11:23:21 crc kubenswrapper[4740]: E1009 11:23:21.929437 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f138a6c72093cb425aa04e1f3ce97e5ed2f5a61b4a54bb25ca6bc4d1deb4a325\": container with ID starting with f138a6c72093cb425aa04e1f3ce97e5ed2f5a61b4a54bb25ca6bc4d1deb4a325 not found: ID does not exist" containerID="f138a6c72093cb425aa04e1f3ce97e5ed2f5a61b4a54bb25ca6bc4d1deb4a325" Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.929470 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f138a6c72093cb425aa04e1f3ce97e5ed2f5a61b4a54bb25ca6bc4d1deb4a325"} err="failed to get container status \"f138a6c72093cb425aa04e1f3ce97e5ed2f5a61b4a54bb25ca6bc4d1deb4a325\": rpc error: code = NotFound desc = could not find container \"f138a6c72093cb425aa04e1f3ce97e5ed2f5a61b4a54bb25ca6bc4d1deb4a325\": container with ID starting with f138a6c72093cb425aa04e1f3ce97e5ed2f5a61b4a54bb25ca6bc4d1deb4a325 not found: ID does not exist" Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.929491 4740 scope.go:117] "RemoveContainer" containerID="2ea0349f0c1768999c9a908ca5adb542e965b9a29dac48828b30acedc3d8b1ef" Oct 09 11:23:21 crc kubenswrapper[4740]: E1009 11:23:21.930076 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ea0349f0c1768999c9a908ca5adb542e965b9a29dac48828b30acedc3d8b1ef\": container with ID starting with 2ea0349f0c1768999c9a908ca5adb542e965b9a29dac48828b30acedc3d8b1ef not found: ID does not exist" containerID="2ea0349f0c1768999c9a908ca5adb542e965b9a29dac48828b30acedc3d8b1ef" Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.930127 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ea0349f0c1768999c9a908ca5adb542e965b9a29dac48828b30acedc3d8b1ef"} err="failed to get container status 
\"2ea0349f0c1768999c9a908ca5adb542e965b9a29dac48828b30acedc3d8b1ef\": rpc error: code = NotFound desc = could not find container \"2ea0349f0c1768999c9a908ca5adb542e965b9a29dac48828b30acedc3d8b1ef\": container with ID starting with 2ea0349f0c1768999c9a908ca5adb542e965b9a29dac48828b30acedc3d8b1ef not found: ID does not exist" Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.930161 4740 scope.go:117] "RemoveContainer" containerID="bb5e246df83822351193cfd8550052699b7ebc3a316592fe919c69bdcd4a4c06" Oct 09 11:23:21 crc kubenswrapper[4740]: E1009 11:23:21.930590 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb5e246df83822351193cfd8550052699b7ebc3a316592fe919c69bdcd4a4c06\": container with ID starting with bb5e246df83822351193cfd8550052699b7ebc3a316592fe919c69bdcd4a4c06 not found: ID does not exist" containerID="bb5e246df83822351193cfd8550052699b7ebc3a316592fe919c69bdcd4a4c06" Oct 09 11:23:21 crc kubenswrapper[4740]: I1009 11:23:21.930660 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb5e246df83822351193cfd8550052699b7ebc3a316592fe919c69bdcd4a4c06"} err="failed to get container status \"bb5e246df83822351193cfd8550052699b7ebc3a316592fe919c69bdcd4a4c06\": rpc error: code = NotFound desc = could not find container \"bb5e246df83822351193cfd8550052699b7ebc3a316592fe919c69bdcd4a4c06\": container with ID starting with bb5e246df83822351193cfd8550052699b7ebc3a316592fe919c69bdcd4a4c06 not found: ID does not exist" Oct 09 11:23:23 crc kubenswrapper[4740]: I1009 11:23:23.767742 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52285861-a6f6-40ae-a79d-eb6e0891dc03" path="/var/lib/kubelet/pods/52285861-a6f6-40ae-a79d-eb6e0891dc03/volumes" Oct 09 11:23:35 crc kubenswrapper[4740]: I1009 11:23:35.407550 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 11:23:35 crc kubenswrapper[4740]: I1009 11:23:35.408377 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 11:24:05 crc kubenswrapper[4740]: I1009 11:24:05.408199 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 11:24:05 crc kubenswrapper[4740]: I1009 11:24:05.410898 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 11:24:35 crc kubenswrapper[4740]: I1009 11:24:35.408209 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 11:24:35 crc kubenswrapper[4740]: I1009 11:24:35.409150 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 11:24:35 crc kubenswrapper[4740]: I1009 11:24:35.409238 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 11:24:35 crc kubenswrapper[4740]: I1009 11:24:35.410492 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"51918591255fd9bc84070769a7fb279ef80f14e662d5784322c40cfbec726c46"} pod="openshift-machine-config-operator/machine-config-daemon-kdjch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 11:24:35 crc kubenswrapper[4740]: I1009 11:24:35.410627 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" containerID="cri-o://51918591255fd9bc84070769a7fb279ef80f14e662d5784322c40cfbec726c46" gracePeriod=600 Oct 09 11:24:35 crc kubenswrapper[4740]: I1009 11:24:35.588830 4740 generic.go:334] "Generic (PLEG): container finished" podID="223b849a-db98-4f56-a649-9e144189950a" containerID="51918591255fd9bc84070769a7fb279ef80f14e662d5784322c40cfbec726c46" exitCode=0 Oct 09 11:24:35 crc kubenswrapper[4740]: I1009 11:24:35.588862 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerDied","Data":"51918591255fd9bc84070769a7fb279ef80f14e662d5784322c40cfbec726c46"} Oct 09 11:24:35 crc kubenswrapper[4740]: I1009 11:24:35.588928 4740 scope.go:117] "RemoveContainer" containerID="ef9c61ecb0095425cc9fd6df4062a775d3b24b89a0c2e4beeff487b832e699e8" Oct 09 11:24:36 crc kubenswrapper[4740]: I1009 11:24:36.600140 4740 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerStarted","Data":"222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930"} Oct 09 11:25:00 crc kubenswrapper[4740]: I1009 11:25:00.331977 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gxlwg"] Oct 09 11:25:00 crc kubenswrapper[4740]: E1009 11:25:00.333568 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52285861-a6f6-40ae-a79d-eb6e0891dc03" containerName="registry-server" Oct 09 11:25:00 crc kubenswrapper[4740]: I1009 11:25:00.333603 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="52285861-a6f6-40ae-a79d-eb6e0891dc03" containerName="registry-server" Oct 09 11:25:00 crc kubenswrapper[4740]: E1009 11:25:00.333683 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ae7a40-4b21-490c-9964-14ccce5d0df9" containerName="extract-utilities" Oct 09 11:25:00 crc kubenswrapper[4740]: I1009 11:25:00.333701 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ae7a40-4b21-490c-9964-14ccce5d0df9" containerName="extract-utilities" Oct 09 11:25:00 crc kubenswrapper[4740]: E1009 11:25:00.333737 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52285861-a6f6-40ae-a79d-eb6e0891dc03" containerName="extract-content" Oct 09 11:25:00 crc kubenswrapper[4740]: I1009 11:25:00.333790 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="52285861-a6f6-40ae-a79d-eb6e0891dc03" containerName="extract-content" Oct 09 11:25:00 crc kubenswrapper[4740]: E1009 11:25:00.333811 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52285861-a6f6-40ae-a79d-eb6e0891dc03" containerName="extract-utilities" Oct 09 11:25:00 crc kubenswrapper[4740]: I1009 11:25:00.333831 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="52285861-a6f6-40ae-a79d-eb6e0891dc03" containerName="extract-utilities" 
Oct 09 11:25:00 crc kubenswrapper[4740]: E1009 11:25:00.333868 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ae7a40-4b21-490c-9964-14ccce5d0df9" containerName="extract-content" Oct 09 11:25:00 crc kubenswrapper[4740]: I1009 11:25:00.333886 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ae7a40-4b21-490c-9964-14ccce5d0df9" containerName="extract-content" Oct 09 11:25:00 crc kubenswrapper[4740]: E1009 11:25:00.333912 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ae7a40-4b21-490c-9964-14ccce5d0df9" containerName="registry-server" Oct 09 11:25:00 crc kubenswrapper[4740]: I1009 11:25:00.333929 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ae7a40-4b21-490c-9964-14ccce5d0df9" containerName="registry-server" Oct 09 11:25:00 crc kubenswrapper[4740]: I1009 11:25:00.334406 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="52285861-a6f6-40ae-a79d-eb6e0891dc03" containerName="registry-server" Oct 09 11:25:00 crc kubenswrapper[4740]: I1009 11:25:00.334491 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ae7a40-4b21-490c-9964-14ccce5d0df9" containerName="registry-server" Oct 09 11:25:00 crc kubenswrapper[4740]: I1009 11:25:00.338031 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gxlwg" Oct 09 11:25:00 crc kubenswrapper[4740]: I1009 11:25:00.353154 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gxlwg"] Oct 09 11:25:00 crc kubenswrapper[4740]: I1009 11:25:00.531889 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17b3ccaf-c445-4a66-b772-2e57a1207f60-catalog-content\") pod \"redhat-marketplace-gxlwg\" (UID: \"17b3ccaf-c445-4a66-b772-2e57a1207f60\") " pod="openshift-marketplace/redhat-marketplace-gxlwg" Oct 09 11:25:00 crc kubenswrapper[4740]: I1009 11:25:00.532304 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17b3ccaf-c445-4a66-b772-2e57a1207f60-utilities\") pod \"redhat-marketplace-gxlwg\" (UID: \"17b3ccaf-c445-4a66-b772-2e57a1207f60\") " pod="openshift-marketplace/redhat-marketplace-gxlwg" Oct 09 11:25:00 crc kubenswrapper[4740]: I1009 11:25:00.532460 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnh5w\" (UniqueName: \"kubernetes.io/projected/17b3ccaf-c445-4a66-b772-2e57a1207f60-kube-api-access-fnh5w\") pod \"redhat-marketplace-gxlwg\" (UID: \"17b3ccaf-c445-4a66-b772-2e57a1207f60\") " pod="openshift-marketplace/redhat-marketplace-gxlwg" Oct 09 11:25:00 crc kubenswrapper[4740]: I1009 11:25:00.634813 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17b3ccaf-c445-4a66-b772-2e57a1207f60-catalog-content\") pod \"redhat-marketplace-gxlwg\" (UID: \"17b3ccaf-c445-4a66-b772-2e57a1207f60\") " pod="openshift-marketplace/redhat-marketplace-gxlwg" Oct 09 11:25:00 crc kubenswrapper[4740]: I1009 11:25:00.634914 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17b3ccaf-c445-4a66-b772-2e57a1207f60-utilities\") pod \"redhat-marketplace-gxlwg\" (UID: \"17b3ccaf-c445-4a66-b772-2e57a1207f60\") " pod="openshift-marketplace/redhat-marketplace-gxlwg" Oct 09 11:25:00 crc kubenswrapper[4740]: I1009 11:25:00.634952 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnh5w\" (UniqueName: \"kubernetes.io/projected/17b3ccaf-c445-4a66-b772-2e57a1207f60-kube-api-access-fnh5w\") pod \"redhat-marketplace-gxlwg\" (UID: \"17b3ccaf-c445-4a66-b772-2e57a1207f60\") " pod="openshift-marketplace/redhat-marketplace-gxlwg" Oct 09 11:25:00 crc kubenswrapper[4740]: I1009 11:25:00.635738 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17b3ccaf-c445-4a66-b772-2e57a1207f60-utilities\") pod \"redhat-marketplace-gxlwg\" (UID: \"17b3ccaf-c445-4a66-b772-2e57a1207f60\") " pod="openshift-marketplace/redhat-marketplace-gxlwg" Oct 09 11:25:00 crc kubenswrapper[4740]: I1009 11:25:00.636052 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17b3ccaf-c445-4a66-b772-2e57a1207f60-catalog-content\") pod \"redhat-marketplace-gxlwg\" (UID: \"17b3ccaf-c445-4a66-b772-2e57a1207f60\") " pod="openshift-marketplace/redhat-marketplace-gxlwg" Oct 09 11:25:00 crc kubenswrapper[4740]: I1009 11:25:00.659116 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnh5w\" (UniqueName: \"kubernetes.io/projected/17b3ccaf-c445-4a66-b772-2e57a1207f60-kube-api-access-fnh5w\") pod \"redhat-marketplace-gxlwg\" (UID: \"17b3ccaf-c445-4a66-b772-2e57a1207f60\") " pod="openshift-marketplace/redhat-marketplace-gxlwg" Oct 09 11:25:00 crc kubenswrapper[4740]: I1009 11:25:00.674012 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gxlwg" Oct 09 11:25:01 crc kubenswrapper[4740]: I1009 11:25:01.105733 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gxlwg"] Oct 09 11:25:01 crc kubenswrapper[4740]: I1009 11:25:01.871980 4740 generic.go:334] "Generic (PLEG): container finished" podID="17b3ccaf-c445-4a66-b772-2e57a1207f60" containerID="b81017d76ade313a04d8168bab4f35840fc64ad1ca5f13e1f9065a378daa74f0" exitCode=0 Oct 09 11:25:01 crc kubenswrapper[4740]: I1009 11:25:01.872069 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gxlwg" event={"ID":"17b3ccaf-c445-4a66-b772-2e57a1207f60","Type":"ContainerDied","Data":"b81017d76ade313a04d8168bab4f35840fc64ad1ca5f13e1f9065a378daa74f0"} Oct 09 11:25:01 crc kubenswrapper[4740]: I1009 11:25:01.872312 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gxlwg" event={"ID":"17b3ccaf-c445-4a66-b772-2e57a1207f60","Type":"ContainerStarted","Data":"0c74a1917bd53b75c7d06866aab7b5f6e03c97c86d76e817c0b6fc6001a6586c"} Oct 09 11:25:03 crc kubenswrapper[4740]: I1009 11:25:03.894894 4740 generic.go:334] "Generic (PLEG): container finished" podID="17b3ccaf-c445-4a66-b772-2e57a1207f60" containerID="2e1ad29c07501454f6b01db3e7675a2ddd9727e330fe7455c992b1666dc5c29e" exitCode=0 Oct 09 11:25:03 crc kubenswrapper[4740]: I1009 11:25:03.895001 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gxlwg" event={"ID":"17b3ccaf-c445-4a66-b772-2e57a1207f60","Type":"ContainerDied","Data":"2e1ad29c07501454f6b01db3e7675a2ddd9727e330fe7455c992b1666dc5c29e"} Oct 09 11:25:04 crc kubenswrapper[4740]: I1009 11:25:04.907674 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gxlwg" 
event={"ID":"17b3ccaf-c445-4a66-b772-2e57a1207f60","Type":"ContainerStarted","Data":"87529eec5159e8d58ecf6329f8e2bd93ca910335491427cf057d34e796ecfd3f"} Oct 09 11:25:10 crc kubenswrapper[4740]: I1009 11:25:10.674667 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gxlwg" Oct 09 11:25:10 crc kubenswrapper[4740]: I1009 11:25:10.675346 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gxlwg" Oct 09 11:25:10 crc kubenswrapper[4740]: I1009 11:25:10.733350 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gxlwg" Oct 09 11:25:10 crc kubenswrapper[4740]: I1009 11:25:10.763009 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gxlwg" podStartSLOduration=8.242002935 podStartE2EDuration="10.762985358s" podCreationTimestamp="2025-10-09 11:25:00 +0000 UTC" firstStartedPulling="2025-10-09 11:25:01.873597139 +0000 UTC m=+3440.835797520" lastFinishedPulling="2025-10-09 11:25:04.394579562 +0000 UTC m=+3443.356779943" observedRunningTime="2025-10-09 11:25:04.92694835 +0000 UTC m=+3443.889148741" watchObservedRunningTime="2025-10-09 11:25:10.762985358 +0000 UTC m=+3449.725185759" Oct 09 11:25:11 crc kubenswrapper[4740]: I1009 11:25:11.031132 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gxlwg" Oct 09 11:25:11 crc kubenswrapper[4740]: I1009 11:25:11.077151 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gxlwg"] Oct 09 11:25:13 crc kubenswrapper[4740]: I1009 11:25:13.008039 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gxlwg" podUID="17b3ccaf-c445-4a66-b772-2e57a1207f60" containerName="registry-server" 
containerID="cri-o://87529eec5159e8d58ecf6329f8e2bd93ca910335491427cf057d34e796ecfd3f" gracePeriod=2 Oct 09 11:25:13 crc kubenswrapper[4740]: I1009 11:25:13.520062 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gxlwg" Oct 09 11:25:13 crc kubenswrapper[4740]: I1009 11:25:13.686440 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnh5w\" (UniqueName: \"kubernetes.io/projected/17b3ccaf-c445-4a66-b772-2e57a1207f60-kube-api-access-fnh5w\") pod \"17b3ccaf-c445-4a66-b772-2e57a1207f60\" (UID: \"17b3ccaf-c445-4a66-b772-2e57a1207f60\") " Oct 09 11:25:13 crc kubenswrapper[4740]: I1009 11:25:13.686616 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17b3ccaf-c445-4a66-b772-2e57a1207f60-utilities\") pod \"17b3ccaf-c445-4a66-b772-2e57a1207f60\" (UID: \"17b3ccaf-c445-4a66-b772-2e57a1207f60\") " Oct 09 11:25:13 crc kubenswrapper[4740]: I1009 11:25:13.686714 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17b3ccaf-c445-4a66-b772-2e57a1207f60-catalog-content\") pod \"17b3ccaf-c445-4a66-b772-2e57a1207f60\" (UID: \"17b3ccaf-c445-4a66-b772-2e57a1207f60\") " Oct 09 11:25:13 crc kubenswrapper[4740]: I1009 11:25:13.687603 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17b3ccaf-c445-4a66-b772-2e57a1207f60-utilities" (OuterVolumeSpecName: "utilities") pod "17b3ccaf-c445-4a66-b772-2e57a1207f60" (UID: "17b3ccaf-c445-4a66-b772-2e57a1207f60"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:25:13 crc kubenswrapper[4740]: I1009 11:25:13.693228 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17b3ccaf-c445-4a66-b772-2e57a1207f60-kube-api-access-fnh5w" (OuterVolumeSpecName: "kube-api-access-fnh5w") pod "17b3ccaf-c445-4a66-b772-2e57a1207f60" (UID: "17b3ccaf-c445-4a66-b772-2e57a1207f60"). InnerVolumeSpecName "kube-api-access-fnh5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:25:13 crc kubenswrapper[4740]: I1009 11:25:13.706437 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17b3ccaf-c445-4a66-b772-2e57a1207f60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17b3ccaf-c445-4a66-b772-2e57a1207f60" (UID: "17b3ccaf-c445-4a66-b772-2e57a1207f60"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:25:13 crc kubenswrapper[4740]: I1009 11:25:13.789040 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17b3ccaf-c445-4a66-b772-2e57a1207f60-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 11:25:13 crc kubenswrapper[4740]: I1009 11:25:13.789088 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnh5w\" (UniqueName: \"kubernetes.io/projected/17b3ccaf-c445-4a66-b772-2e57a1207f60-kube-api-access-fnh5w\") on node \"crc\" DevicePath \"\"" Oct 09 11:25:13 crc kubenswrapper[4740]: I1009 11:25:13.789111 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17b3ccaf-c445-4a66-b772-2e57a1207f60-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 11:25:14 crc kubenswrapper[4740]: I1009 11:25:14.024871 4740 generic.go:334] "Generic (PLEG): container finished" podID="17b3ccaf-c445-4a66-b772-2e57a1207f60" 
containerID="87529eec5159e8d58ecf6329f8e2bd93ca910335491427cf057d34e796ecfd3f" exitCode=0 Oct 09 11:25:14 crc kubenswrapper[4740]: I1009 11:25:14.024983 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gxlwg" Oct 09 11:25:14 crc kubenswrapper[4740]: I1009 11:25:14.025010 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gxlwg" event={"ID":"17b3ccaf-c445-4a66-b772-2e57a1207f60","Type":"ContainerDied","Data":"87529eec5159e8d58ecf6329f8e2bd93ca910335491427cf057d34e796ecfd3f"} Oct 09 11:25:14 crc kubenswrapper[4740]: I1009 11:25:14.025462 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gxlwg" event={"ID":"17b3ccaf-c445-4a66-b772-2e57a1207f60","Type":"ContainerDied","Data":"0c74a1917bd53b75c7d06866aab7b5f6e03c97c86d76e817c0b6fc6001a6586c"} Oct 09 11:25:14 crc kubenswrapper[4740]: I1009 11:25:14.025495 4740 scope.go:117] "RemoveContainer" containerID="87529eec5159e8d58ecf6329f8e2bd93ca910335491427cf057d34e796ecfd3f" Oct 09 11:25:14 crc kubenswrapper[4740]: I1009 11:25:14.068956 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gxlwg"] Oct 09 11:25:14 crc kubenswrapper[4740]: I1009 11:25:14.069277 4740 scope.go:117] "RemoveContainer" containerID="2e1ad29c07501454f6b01db3e7675a2ddd9727e330fe7455c992b1666dc5c29e" Oct 09 11:25:14 crc kubenswrapper[4740]: I1009 11:25:14.079938 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gxlwg"] Oct 09 11:25:14 crc kubenswrapper[4740]: I1009 11:25:14.096574 4740 scope.go:117] "RemoveContainer" containerID="b81017d76ade313a04d8168bab4f35840fc64ad1ca5f13e1f9065a378daa74f0" Oct 09 11:25:14 crc kubenswrapper[4740]: I1009 11:25:14.166450 4740 scope.go:117] "RemoveContainer" containerID="87529eec5159e8d58ecf6329f8e2bd93ca910335491427cf057d34e796ecfd3f" Oct 09 
11:25:14 crc kubenswrapper[4740]: E1009 11:25:14.167207 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87529eec5159e8d58ecf6329f8e2bd93ca910335491427cf057d34e796ecfd3f\": container with ID starting with 87529eec5159e8d58ecf6329f8e2bd93ca910335491427cf057d34e796ecfd3f not found: ID does not exist" containerID="87529eec5159e8d58ecf6329f8e2bd93ca910335491427cf057d34e796ecfd3f" Oct 09 11:25:14 crc kubenswrapper[4740]: I1009 11:25:14.167294 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87529eec5159e8d58ecf6329f8e2bd93ca910335491427cf057d34e796ecfd3f"} err="failed to get container status \"87529eec5159e8d58ecf6329f8e2bd93ca910335491427cf057d34e796ecfd3f\": rpc error: code = NotFound desc = could not find container \"87529eec5159e8d58ecf6329f8e2bd93ca910335491427cf057d34e796ecfd3f\": container with ID starting with 87529eec5159e8d58ecf6329f8e2bd93ca910335491427cf057d34e796ecfd3f not found: ID does not exist" Oct 09 11:25:14 crc kubenswrapper[4740]: I1009 11:25:14.167343 4740 scope.go:117] "RemoveContainer" containerID="2e1ad29c07501454f6b01db3e7675a2ddd9727e330fe7455c992b1666dc5c29e" Oct 09 11:25:14 crc kubenswrapper[4740]: E1009 11:25:14.167701 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e1ad29c07501454f6b01db3e7675a2ddd9727e330fe7455c992b1666dc5c29e\": container with ID starting with 2e1ad29c07501454f6b01db3e7675a2ddd9727e330fe7455c992b1666dc5c29e not found: ID does not exist" containerID="2e1ad29c07501454f6b01db3e7675a2ddd9727e330fe7455c992b1666dc5c29e" Oct 09 11:25:14 crc kubenswrapper[4740]: I1009 11:25:14.167736 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e1ad29c07501454f6b01db3e7675a2ddd9727e330fe7455c992b1666dc5c29e"} err="failed to get container status 
\"2e1ad29c07501454f6b01db3e7675a2ddd9727e330fe7455c992b1666dc5c29e\": rpc error: code = NotFound desc = could not find container \"2e1ad29c07501454f6b01db3e7675a2ddd9727e330fe7455c992b1666dc5c29e\": container with ID starting with 2e1ad29c07501454f6b01db3e7675a2ddd9727e330fe7455c992b1666dc5c29e not found: ID does not exist" Oct 09 11:25:14 crc kubenswrapper[4740]: I1009 11:25:14.167786 4740 scope.go:117] "RemoveContainer" containerID="b81017d76ade313a04d8168bab4f35840fc64ad1ca5f13e1f9065a378daa74f0" Oct 09 11:25:14 crc kubenswrapper[4740]: E1009 11:25:14.168092 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b81017d76ade313a04d8168bab4f35840fc64ad1ca5f13e1f9065a378daa74f0\": container with ID starting with b81017d76ade313a04d8168bab4f35840fc64ad1ca5f13e1f9065a378daa74f0 not found: ID does not exist" containerID="b81017d76ade313a04d8168bab4f35840fc64ad1ca5f13e1f9065a378daa74f0" Oct 09 11:25:14 crc kubenswrapper[4740]: I1009 11:25:14.168128 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b81017d76ade313a04d8168bab4f35840fc64ad1ca5f13e1f9065a378daa74f0"} err="failed to get container status \"b81017d76ade313a04d8168bab4f35840fc64ad1ca5f13e1f9065a378daa74f0\": rpc error: code = NotFound desc = could not find container \"b81017d76ade313a04d8168bab4f35840fc64ad1ca5f13e1f9065a378daa74f0\": container with ID starting with b81017d76ade313a04d8168bab4f35840fc64ad1ca5f13e1f9065a378daa74f0 not found: ID does not exist" Oct 09 11:25:15 crc kubenswrapper[4740]: I1009 11:25:15.766417 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17b3ccaf-c445-4a66-b772-2e57a1207f60" path="/var/lib/kubelet/pods/17b3ccaf-c445-4a66-b772-2e57a1207f60/volumes" Oct 09 11:25:25 crc kubenswrapper[4740]: I1009 11:25:25.151508 4740 generic.go:334] "Generic (PLEG): container finished" podID="4c1a2aba-0872-4bef-9bad-0ba37788423d" 
containerID="c8f27d502ab7f6756924b00c462816148003392ebd595d74ba4d2e028498dcde" exitCode=0 Oct 09 11:25:25 crc kubenswrapper[4740]: I1009 11:25:25.151594 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4c1a2aba-0872-4bef-9bad-0ba37788423d","Type":"ContainerDied","Data":"c8f27d502ab7f6756924b00c462816148003392ebd595d74ba4d2e028498dcde"} Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.534433 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.646089 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c1a2aba-0872-4bef-9bad-0ba37788423d-config-data\") pod \"4c1a2aba-0872-4bef-9bad-0ba37788423d\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.646186 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4c1a2aba-0872-4bef-9bad-0ba37788423d-ca-certs\") pod \"4c1a2aba-0872-4bef-9bad-0ba37788423d\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.646255 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4c1a2aba-0872-4bef-9bad-0ba37788423d-test-operator-ephemeral-workdir\") pod \"4c1a2aba-0872-4bef-9bad-0ba37788423d\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.646340 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c1a2aba-0872-4bef-9bad-0ba37788423d-ssh-key\") pod \"4c1a2aba-0872-4bef-9bad-0ba37788423d\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " Oct 09 
11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.646425 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4c1a2aba-0872-4bef-9bad-0ba37788423d-test-operator-ephemeral-temporary\") pod \"4c1a2aba-0872-4bef-9bad-0ba37788423d\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.646504 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgskx\" (UniqueName: \"kubernetes.io/projected/4c1a2aba-0872-4bef-9bad-0ba37788423d-kube-api-access-dgskx\") pod \"4c1a2aba-0872-4bef-9bad-0ba37788423d\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.646554 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"4c1a2aba-0872-4bef-9bad-0ba37788423d\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.646628 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4c1a2aba-0872-4bef-9bad-0ba37788423d-openstack-config-secret\") pod \"4c1a2aba-0872-4bef-9bad-0ba37788423d\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.646704 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4c1a2aba-0872-4bef-9bad-0ba37788423d-openstack-config\") pod \"4c1a2aba-0872-4bef-9bad-0ba37788423d\" (UID: \"4c1a2aba-0872-4bef-9bad-0ba37788423d\") " Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.646973 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4c1a2aba-0872-4bef-9bad-0ba37788423d-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "4c1a2aba-0872-4bef-9bad-0ba37788423d" (UID: "4c1a2aba-0872-4bef-9bad-0ba37788423d"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.647441 4740 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4c1a2aba-0872-4bef-9bad-0ba37788423d-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.647481 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c1a2aba-0872-4bef-9bad-0ba37788423d-config-data" (OuterVolumeSpecName: "config-data") pod "4c1a2aba-0872-4bef-9bad-0ba37788423d" (UID: "4c1a2aba-0872-4bef-9bad-0ba37788423d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.651626 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "4c1a2aba-0872-4bef-9bad-0ba37788423d" (UID: "4c1a2aba-0872-4bef-9bad-0ba37788423d"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.653389 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c1a2aba-0872-4bef-9bad-0ba37788423d-kube-api-access-dgskx" (OuterVolumeSpecName: "kube-api-access-dgskx") pod "4c1a2aba-0872-4bef-9bad-0ba37788423d" (UID: "4c1a2aba-0872-4bef-9bad-0ba37788423d"). InnerVolumeSpecName "kube-api-access-dgskx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.654986 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c1a2aba-0872-4bef-9bad-0ba37788423d-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "4c1a2aba-0872-4bef-9bad-0ba37788423d" (UID: "4c1a2aba-0872-4bef-9bad-0ba37788423d"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.679264 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c1a2aba-0872-4bef-9bad-0ba37788423d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "4c1a2aba-0872-4bef-9bad-0ba37788423d" (UID: "4c1a2aba-0872-4bef-9bad-0ba37788423d"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.683964 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c1a2aba-0872-4bef-9bad-0ba37788423d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4c1a2aba-0872-4bef-9bad-0ba37788423d" (UID: "4c1a2aba-0872-4bef-9bad-0ba37788423d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.698690 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c1a2aba-0872-4bef-9bad-0ba37788423d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "4c1a2aba-0872-4bef-9bad-0ba37788423d" (UID: "4c1a2aba-0872-4bef-9bad-0ba37788423d"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.703603 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c1a2aba-0872-4bef-9bad-0ba37788423d-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "4c1a2aba-0872-4bef-9bad-0ba37788423d" (UID: "4c1a2aba-0872-4bef-9bad-0ba37788423d"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.749396 4740 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4c1a2aba-0872-4bef-9bad-0ba37788423d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.749448 4740 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4c1a2aba-0872-4bef-9bad-0ba37788423d-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.749469 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c1a2aba-0872-4bef-9bad-0ba37788423d-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.749487 4740 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4c1a2aba-0872-4bef-9bad-0ba37788423d-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.749508 4740 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4c1a2aba-0872-4bef-9bad-0ba37788423d-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.749526 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/4c1a2aba-0872-4bef-9bad-0ba37788423d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.749543 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgskx\" (UniqueName: \"kubernetes.io/projected/4c1a2aba-0872-4bef-9bad-0ba37788423d-kube-api-access-dgskx\") on node \"crc\" DevicePath \"\"" Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.749595 4740 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.789107 4740 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 09 11:25:26 crc kubenswrapper[4740]: I1009 11:25:26.852253 4740 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 09 11:25:27 crc kubenswrapper[4740]: I1009 11:25:27.179828 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4c1a2aba-0872-4bef-9bad-0ba37788423d","Type":"ContainerDied","Data":"914ffb10720733b1eafe6397d2f881501953d964707b9afbdd8841fc8d96ed25"} Oct 09 11:25:27 crc kubenswrapper[4740]: I1009 11:25:27.179886 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="914ffb10720733b1eafe6397d2f881501953d964707b9afbdd8841fc8d96ed25" Oct 09 11:25:27 crc kubenswrapper[4740]: I1009 11:25:27.180010 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 09 11:25:34 crc kubenswrapper[4740]: I1009 11:25:34.342394 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 09 11:25:34 crc kubenswrapper[4740]: E1009 11:25:34.345090 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b3ccaf-c445-4a66-b772-2e57a1207f60" containerName="extract-utilities" Oct 09 11:25:34 crc kubenswrapper[4740]: I1009 11:25:34.345463 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b3ccaf-c445-4a66-b772-2e57a1207f60" containerName="extract-utilities" Oct 09 11:25:34 crc kubenswrapper[4740]: E1009 11:25:34.345620 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b3ccaf-c445-4a66-b772-2e57a1207f60" containerName="extract-content" Oct 09 11:25:34 crc kubenswrapper[4740]: I1009 11:25:34.345753 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b3ccaf-c445-4a66-b772-2e57a1207f60" containerName="extract-content" Oct 09 11:25:34 crc kubenswrapper[4740]: E1009 11:25:34.345942 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c1a2aba-0872-4bef-9bad-0ba37788423d" containerName="tempest-tests-tempest-tests-runner" Oct 09 11:25:34 crc kubenswrapper[4740]: I1009 11:25:34.346079 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c1a2aba-0872-4bef-9bad-0ba37788423d" containerName="tempest-tests-tempest-tests-runner" Oct 09 11:25:34 crc kubenswrapper[4740]: E1009 11:25:34.346247 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b3ccaf-c445-4a66-b772-2e57a1207f60" containerName="registry-server" Oct 09 11:25:34 crc kubenswrapper[4740]: I1009 11:25:34.346365 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b3ccaf-c445-4a66-b772-2e57a1207f60" containerName="registry-server" Oct 09 11:25:34 crc kubenswrapper[4740]: I1009 11:25:34.346883 4740 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="4c1a2aba-0872-4bef-9bad-0ba37788423d" containerName="tempest-tests-tempest-tests-runner" Oct 09 11:25:34 crc kubenswrapper[4740]: I1009 11:25:34.347049 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="17b3ccaf-c445-4a66-b772-2e57a1207f60" containerName="registry-server" Oct 09 11:25:34 crc kubenswrapper[4740]: I1009 11:25:34.348339 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 11:25:34 crc kubenswrapper[4740]: I1009 11:25:34.359549 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 09 11:25:34 crc kubenswrapper[4740]: I1009 11:25:34.361069 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-6zqq5" Oct 09 11:25:34 crc kubenswrapper[4740]: I1009 11:25:34.511721 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0486d8ef-3b23-4d76-9764-a3d48c174482\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 11:25:34 crc kubenswrapper[4740]: I1009 11:25:34.511789 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qjhz\" (UniqueName: \"kubernetes.io/projected/0486d8ef-3b23-4d76-9764-a3d48c174482-kube-api-access-5qjhz\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0486d8ef-3b23-4d76-9764-a3d48c174482\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 11:25:34 crc kubenswrapper[4740]: I1009 11:25:34.613966 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0486d8ef-3b23-4d76-9764-a3d48c174482\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 11:25:34 crc kubenswrapper[4740]: I1009 11:25:34.614044 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qjhz\" (UniqueName: \"kubernetes.io/projected/0486d8ef-3b23-4d76-9764-a3d48c174482-kube-api-access-5qjhz\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0486d8ef-3b23-4d76-9764-a3d48c174482\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 11:25:34 crc kubenswrapper[4740]: I1009 11:25:34.615026 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0486d8ef-3b23-4d76-9764-a3d48c174482\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 11:25:34 crc kubenswrapper[4740]: I1009 11:25:34.633496 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qjhz\" (UniqueName: \"kubernetes.io/projected/0486d8ef-3b23-4d76-9764-a3d48c174482-kube-api-access-5qjhz\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0486d8ef-3b23-4d76-9764-a3d48c174482\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 11:25:34 crc kubenswrapper[4740]: I1009 11:25:34.640537 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0486d8ef-3b23-4d76-9764-a3d48c174482\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 11:25:34 
crc kubenswrapper[4740]: I1009 11:25:34.689606 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 11:25:34 crc kubenswrapper[4740]: I1009 11:25:34.946276 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 09 11:25:35 crc kubenswrapper[4740]: I1009 11:25:35.276606 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0486d8ef-3b23-4d76-9764-a3d48c174482","Type":"ContainerStarted","Data":"9e45c5e7b3ab35f9943383c12b5b0c5a1d7b42c01bee1547e31e754adaece0d6"} Oct 09 11:25:36 crc kubenswrapper[4740]: I1009 11:25:36.299064 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0486d8ef-3b23-4d76-9764-a3d48c174482","Type":"ContainerStarted","Data":"6cb6bd8879e5e46ba1ec970376694fcfaf11f96bfd4c8a3eaf0b95280dd2ca76"} Oct 09 11:25:36 crc kubenswrapper[4740]: I1009 11:25:36.324314 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.277704838 podStartE2EDuration="2.324288544s" podCreationTimestamp="2025-10-09 11:25:34 +0000 UTC" firstStartedPulling="2025-10-09 11:25:34.968023988 +0000 UTC m=+3473.930224369" lastFinishedPulling="2025-10-09 11:25:36.014607664 +0000 UTC m=+3474.976808075" observedRunningTime="2025-10-09 11:25:36.315440106 +0000 UTC m=+3475.277640487" watchObservedRunningTime="2025-10-09 11:25:36.324288544 +0000 UTC m=+3475.286488925" Oct 09 11:25:53 crc kubenswrapper[4740]: I1009 11:25:53.673060 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mq5nd/must-gather-kszhj"] Oct 09 11:25:53 crc kubenswrapper[4740]: I1009 11:25:53.675030 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mq5nd/must-gather-kszhj" Oct 09 11:25:53 crc kubenswrapper[4740]: I1009 11:25:53.678598 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mq5nd"/"kube-root-ca.crt" Oct 09 11:25:53 crc kubenswrapper[4740]: I1009 11:25:53.678865 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mq5nd"/"openshift-service-ca.crt" Oct 09 11:25:53 crc kubenswrapper[4740]: I1009 11:25:53.679069 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-mq5nd"/"default-dockercfg-lgn8v" Oct 09 11:25:53 crc kubenswrapper[4740]: I1009 11:25:53.682316 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mq5nd/must-gather-kszhj"] Oct 09 11:25:53 crc kubenswrapper[4740]: I1009 11:25:53.828393 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9669ea77-a286-4fe0-8a3f-26653ca161e5-must-gather-output\") pod \"must-gather-kszhj\" (UID: \"9669ea77-a286-4fe0-8a3f-26653ca161e5\") " pod="openshift-must-gather-mq5nd/must-gather-kszhj" Oct 09 11:25:53 crc kubenswrapper[4740]: I1009 11:25:53.828467 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvtlq\" (UniqueName: \"kubernetes.io/projected/9669ea77-a286-4fe0-8a3f-26653ca161e5-kube-api-access-cvtlq\") pod \"must-gather-kszhj\" (UID: \"9669ea77-a286-4fe0-8a3f-26653ca161e5\") " pod="openshift-must-gather-mq5nd/must-gather-kszhj" Oct 09 11:25:53 crc kubenswrapper[4740]: I1009 11:25:53.929685 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9669ea77-a286-4fe0-8a3f-26653ca161e5-must-gather-output\") pod \"must-gather-kszhj\" (UID: \"9669ea77-a286-4fe0-8a3f-26653ca161e5\") " 
pod="openshift-must-gather-mq5nd/must-gather-kszhj" Oct 09 11:25:53 crc kubenswrapper[4740]: I1009 11:25:53.929797 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvtlq\" (UniqueName: \"kubernetes.io/projected/9669ea77-a286-4fe0-8a3f-26653ca161e5-kube-api-access-cvtlq\") pod \"must-gather-kszhj\" (UID: \"9669ea77-a286-4fe0-8a3f-26653ca161e5\") " pod="openshift-must-gather-mq5nd/must-gather-kszhj" Oct 09 11:25:53 crc kubenswrapper[4740]: I1009 11:25:53.930276 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9669ea77-a286-4fe0-8a3f-26653ca161e5-must-gather-output\") pod \"must-gather-kszhj\" (UID: \"9669ea77-a286-4fe0-8a3f-26653ca161e5\") " pod="openshift-must-gather-mq5nd/must-gather-kszhj" Oct 09 11:25:53 crc kubenswrapper[4740]: I1009 11:25:53.947246 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvtlq\" (UniqueName: \"kubernetes.io/projected/9669ea77-a286-4fe0-8a3f-26653ca161e5-kube-api-access-cvtlq\") pod \"must-gather-kszhj\" (UID: \"9669ea77-a286-4fe0-8a3f-26653ca161e5\") " pod="openshift-must-gather-mq5nd/must-gather-kszhj" Oct 09 11:25:53 crc kubenswrapper[4740]: I1009 11:25:53.998660 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mq5nd/must-gather-kszhj" Oct 09 11:25:54 crc kubenswrapper[4740]: I1009 11:25:54.473371 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mq5nd/must-gather-kszhj"] Oct 09 11:25:54 crc kubenswrapper[4740]: W1009 11:25:54.479844 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9669ea77_a286_4fe0_8a3f_26653ca161e5.slice/crio-e024da2d64e1fa94d160207e873197b2cb96da805819cc6471d66ef0f03f9bc6 WatchSource:0}: Error finding container e024da2d64e1fa94d160207e873197b2cb96da805819cc6471d66ef0f03f9bc6: Status 404 returned error can't find the container with id e024da2d64e1fa94d160207e873197b2cb96da805819cc6471d66ef0f03f9bc6 Oct 09 11:25:54 crc kubenswrapper[4740]: I1009 11:25:54.493152 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mq5nd/must-gather-kszhj" event={"ID":"9669ea77-a286-4fe0-8a3f-26653ca161e5","Type":"ContainerStarted","Data":"e024da2d64e1fa94d160207e873197b2cb96da805819cc6471d66ef0f03f9bc6"} Oct 09 11:25:59 crc kubenswrapper[4740]: I1009 11:25:59.534911 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mq5nd/must-gather-kszhj" event={"ID":"9669ea77-a286-4fe0-8a3f-26653ca161e5","Type":"ContainerStarted","Data":"228aae2a2727c17fc5a8fcee31820460e04dc56143f6e736396da3e1663b9a4a"} Oct 09 11:25:59 crc kubenswrapper[4740]: I1009 11:25:59.535814 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mq5nd/must-gather-kszhj" event={"ID":"9669ea77-a286-4fe0-8a3f-26653ca161e5","Type":"ContainerStarted","Data":"56c4518f771884721960d702faf5b34d5899cb6efcd70ffa8f72aef91632b1e1"} Oct 09 11:25:59 crc kubenswrapper[4740]: I1009 11:25:59.557679 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mq5nd/must-gather-kszhj" podStartSLOduration=2.736603518 
podStartE2EDuration="6.55765587s" podCreationTimestamp="2025-10-09 11:25:53 +0000 UTC" firstStartedPulling="2025-10-09 11:25:54.482307807 +0000 UTC m=+3493.444508228" lastFinishedPulling="2025-10-09 11:25:58.303360199 +0000 UTC m=+3497.265560580" observedRunningTime="2025-10-09 11:25:59.551639728 +0000 UTC m=+3498.513840109" watchObservedRunningTime="2025-10-09 11:25:59.55765587 +0000 UTC m=+3498.519856251" Oct 09 11:26:02 crc kubenswrapper[4740]: I1009 11:26:02.026887 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mq5nd/crc-debug-xqhdc"] Oct 09 11:26:02 crc kubenswrapper[4740]: I1009 11:26:02.030432 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mq5nd/crc-debug-xqhdc" Oct 09 11:26:02 crc kubenswrapper[4740]: I1009 11:26:02.155809 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmdk7\" (UniqueName: \"kubernetes.io/projected/d63eacd0-27c4-40ba-9c20-74ad99d87b84-kube-api-access-cmdk7\") pod \"crc-debug-xqhdc\" (UID: \"d63eacd0-27c4-40ba-9c20-74ad99d87b84\") " pod="openshift-must-gather-mq5nd/crc-debug-xqhdc" Oct 09 11:26:02 crc kubenswrapper[4740]: I1009 11:26:02.156154 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d63eacd0-27c4-40ba-9c20-74ad99d87b84-host\") pod \"crc-debug-xqhdc\" (UID: \"d63eacd0-27c4-40ba-9c20-74ad99d87b84\") " pod="openshift-must-gather-mq5nd/crc-debug-xqhdc" Oct 09 11:26:02 crc kubenswrapper[4740]: I1009 11:26:02.258173 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmdk7\" (UniqueName: \"kubernetes.io/projected/d63eacd0-27c4-40ba-9c20-74ad99d87b84-kube-api-access-cmdk7\") pod \"crc-debug-xqhdc\" (UID: \"d63eacd0-27c4-40ba-9c20-74ad99d87b84\") " pod="openshift-must-gather-mq5nd/crc-debug-xqhdc" Oct 09 11:26:02 crc kubenswrapper[4740]: 
I1009 11:26:02.258409 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d63eacd0-27c4-40ba-9c20-74ad99d87b84-host\") pod \"crc-debug-xqhdc\" (UID: \"d63eacd0-27c4-40ba-9c20-74ad99d87b84\") " pod="openshift-must-gather-mq5nd/crc-debug-xqhdc" Oct 09 11:26:02 crc kubenswrapper[4740]: I1009 11:26:02.258512 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d63eacd0-27c4-40ba-9c20-74ad99d87b84-host\") pod \"crc-debug-xqhdc\" (UID: \"d63eacd0-27c4-40ba-9c20-74ad99d87b84\") " pod="openshift-must-gather-mq5nd/crc-debug-xqhdc" Oct 09 11:26:02 crc kubenswrapper[4740]: I1009 11:26:02.275217 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmdk7\" (UniqueName: \"kubernetes.io/projected/d63eacd0-27c4-40ba-9c20-74ad99d87b84-kube-api-access-cmdk7\") pod \"crc-debug-xqhdc\" (UID: \"d63eacd0-27c4-40ba-9c20-74ad99d87b84\") " pod="openshift-must-gather-mq5nd/crc-debug-xqhdc" Oct 09 11:26:02 crc kubenswrapper[4740]: I1009 11:26:02.348916 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mq5nd/crc-debug-xqhdc" Oct 09 11:26:02 crc kubenswrapper[4740]: I1009 11:26:02.564322 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mq5nd/crc-debug-xqhdc" event={"ID":"d63eacd0-27c4-40ba-9c20-74ad99d87b84","Type":"ContainerStarted","Data":"71b82f974fa6112b3652e83c42f41ef5762bbead0acd8570df55eb2c793b093e"} Oct 09 11:26:14 crc kubenswrapper[4740]: I1009 11:26:14.691058 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mq5nd/crc-debug-xqhdc" event={"ID":"d63eacd0-27c4-40ba-9c20-74ad99d87b84","Type":"ContainerStarted","Data":"552c84fb0547b9edd0702a862120930ffca8694d14aa5c3a7166d7e2bc55172a"} Oct 09 11:26:14 crc kubenswrapper[4740]: I1009 11:26:14.711910 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mq5nd/crc-debug-xqhdc" podStartSLOduration=1.558331742 podStartE2EDuration="12.711890591s" podCreationTimestamp="2025-10-09 11:26:02 +0000 UTC" firstStartedPulling="2025-10-09 11:26:02.379849076 +0000 UTC m=+3501.342049477" lastFinishedPulling="2025-10-09 11:26:13.533407945 +0000 UTC m=+3512.495608326" observedRunningTime="2025-10-09 11:26:14.70698616 +0000 UTC m=+3513.669186551" watchObservedRunningTime="2025-10-09 11:26:14.711890591 +0000 UTC m=+3513.674090972" Oct 09 11:26:35 crc kubenswrapper[4740]: I1009 11:26:35.407799 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 11:26:35 crc kubenswrapper[4740]: I1009 11:26:35.408403 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 11:26:51 crc kubenswrapper[4740]: I1009 11:26:51.018727 4740 generic.go:334] "Generic (PLEG): container finished" podID="d63eacd0-27c4-40ba-9c20-74ad99d87b84" containerID="552c84fb0547b9edd0702a862120930ffca8694d14aa5c3a7166d7e2bc55172a" exitCode=0 Oct 09 11:26:51 crc kubenswrapper[4740]: I1009 11:26:51.018789 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mq5nd/crc-debug-xqhdc" event={"ID":"d63eacd0-27c4-40ba-9c20-74ad99d87b84","Type":"ContainerDied","Data":"552c84fb0547b9edd0702a862120930ffca8694d14aa5c3a7166d7e2bc55172a"} Oct 09 11:26:52 crc kubenswrapper[4740]: I1009 11:26:52.127337 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mq5nd/crc-debug-xqhdc" Oct 09 11:26:52 crc kubenswrapper[4740]: I1009 11:26:52.171463 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mq5nd/crc-debug-xqhdc"] Oct 09 11:26:52 crc kubenswrapper[4740]: I1009 11:26:52.180402 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mq5nd/crc-debug-xqhdc"] Oct 09 11:26:52 crc kubenswrapper[4740]: I1009 11:26:52.183410 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d63eacd0-27c4-40ba-9c20-74ad99d87b84-host\") pod \"d63eacd0-27c4-40ba-9c20-74ad99d87b84\" (UID: \"d63eacd0-27c4-40ba-9c20-74ad99d87b84\") " Oct 09 11:26:52 crc kubenswrapper[4740]: I1009 11:26:52.183500 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmdk7\" (UniqueName: \"kubernetes.io/projected/d63eacd0-27c4-40ba-9c20-74ad99d87b84-kube-api-access-cmdk7\") pod \"d63eacd0-27c4-40ba-9c20-74ad99d87b84\" (UID: \"d63eacd0-27c4-40ba-9c20-74ad99d87b84\") " Oct 09 11:26:52 crc kubenswrapper[4740]: I1009 11:26:52.184734 4740 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d63eacd0-27c4-40ba-9c20-74ad99d87b84-host" (OuterVolumeSpecName: "host") pod "d63eacd0-27c4-40ba-9c20-74ad99d87b84" (UID: "d63eacd0-27c4-40ba-9c20-74ad99d87b84"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 11:26:52 crc kubenswrapper[4740]: I1009 11:26:52.191957 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d63eacd0-27c4-40ba-9c20-74ad99d87b84-kube-api-access-cmdk7" (OuterVolumeSpecName: "kube-api-access-cmdk7") pod "d63eacd0-27c4-40ba-9c20-74ad99d87b84" (UID: "d63eacd0-27c4-40ba-9c20-74ad99d87b84"). InnerVolumeSpecName "kube-api-access-cmdk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:26:52 crc kubenswrapper[4740]: I1009 11:26:52.285952 4740 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d63eacd0-27c4-40ba-9c20-74ad99d87b84-host\") on node \"crc\" DevicePath \"\"" Oct 09 11:26:52 crc kubenswrapper[4740]: I1009 11:26:52.285988 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmdk7\" (UniqueName: \"kubernetes.io/projected/d63eacd0-27c4-40ba-9c20-74ad99d87b84-kube-api-access-cmdk7\") on node \"crc\" DevicePath \"\"" Oct 09 11:26:53 crc kubenswrapper[4740]: I1009 11:26:53.035006 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71b82f974fa6112b3652e83c42f41ef5762bbead0acd8570df55eb2c793b093e" Oct 09 11:26:53 crc kubenswrapper[4740]: I1009 11:26:53.035065 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mq5nd/crc-debug-xqhdc" Oct 09 11:26:53 crc kubenswrapper[4740]: I1009 11:26:53.429489 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mq5nd/crc-debug-t7c8h"] Oct 09 11:26:53 crc kubenswrapper[4740]: E1009 11:26:53.429911 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d63eacd0-27c4-40ba-9c20-74ad99d87b84" containerName="container-00" Oct 09 11:26:53 crc kubenswrapper[4740]: I1009 11:26:53.429921 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d63eacd0-27c4-40ba-9c20-74ad99d87b84" containerName="container-00" Oct 09 11:26:53 crc kubenswrapper[4740]: I1009 11:26:53.430084 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="d63eacd0-27c4-40ba-9c20-74ad99d87b84" containerName="container-00" Oct 09 11:26:53 crc kubenswrapper[4740]: I1009 11:26:53.430637 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mq5nd/crc-debug-t7c8h" Oct 09 11:26:53 crc kubenswrapper[4740]: I1009 11:26:53.504911 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7lxf\" (UniqueName: \"kubernetes.io/projected/ef6a5c2d-165b-4f23-9efe-30d89f6915e1-kube-api-access-g7lxf\") pod \"crc-debug-t7c8h\" (UID: \"ef6a5c2d-165b-4f23-9efe-30d89f6915e1\") " pod="openshift-must-gather-mq5nd/crc-debug-t7c8h" Oct 09 11:26:53 crc kubenswrapper[4740]: I1009 11:26:53.505058 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef6a5c2d-165b-4f23-9efe-30d89f6915e1-host\") pod \"crc-debug-t7c8h\" (UID: \"ef6a5c2d-165b-4f23-9efe-30d89f6915e1\") " pod="openshift-must-gather-mq5nd/crc-debug-t7c8h" Oct 09 11:26:53 crc kubenswrapper[4740]: I1009 11:26:53.606086 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/ef6a5c2d-165b-4f23-9efe-30d89f6915e1-host\") pod \"crc-debug-t7c8h\" (UID: \"ef6a5c2d-165b-4f23-9efe-30d89f6915e1\") " pod="openshift-must-gather-mq5nd/crc-debug-t7c8h" Oct 09 11:26:53 crc kubenswrapper[4740]: I1009 11:26:53.606222 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7lxf\" (UniqueName: \"kubernetes.io/projected/ef6a5c2d-165b-4f23-9efe-30d89f6915e1-kube-api-access-g7lxf\") pod \"crc-debug-t7c8h\" (UID: \"ef6a5c2d-165b-4f23-9efe-30d89f6915e1\") " pod="openshift-must-gather-mq5nd/crc-debug-t7c8h" Oct 09 11:26:53 crc kubenswrapper[4740]: I1009 11:26:53.606220 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef6a5c2d-165b-4f23-9efe-30d89f6915e1-host\") pod \"crc-debug-t7c8h\" (UID: \"ef6a5c2d-165b-4f23-9efe-30d89f6915e1\") " pod="openshift-must-gather-mq5nd/crc-debug-t7c8h" Oct 09 11:26:53 crc kubenswrapper[4740]: I1009 11:26:53.631302 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7lxf\" (UniqueName: \"kubernetes.io/projected/ef6a5c2d-165b-4f23-9efe-30d89f6915e1-kube-api-access-g7lxf\") pod \"crc-debug-t7c8h\" (UID: \"ef6a5c2d-165b-4f23-9efe-30d89f6915e1\") " pod="openshift-must-gather-mq5nd/crc-debug-t7c8h" Oct 09 11:26:53 crc kubenswrapper[4740]: I1009 11:26:53.754163 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mq5nd/crc-debug-t7c8h" Oct 09 11:26:53 crc kubenswrapper[4740]: I1009 11:26:53.766921 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d63eacd0-27c4-40ba-9c20-74ad99d87b84" path="/var/lib/kubelet/pods/d63eacd0-27c4-40ba-9c20-74ad99d87b84/volumes" Oct 09 11:26:54 crc kubenswrapper[4740]: I1009 11:26:54.044646 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mq5nd/crc-debug-t7c8h" event={"ID":"ef6a5c2d-165b-4f23-9efe-30d89f6915e1","Type":"ContainerStarted","Data":"0168a82f1393c2ad35f40ee0a56f7864d0b088d421b94bba222d28d277bb8a9e"} Oct 09 11:26:54 crc kubenswrapper[4740]: I1009 11:26:54.045105 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mq5nd/crc-debug-t7c8h" event={"ID":"ef6a5c2d-165b-4f23-9efe-30d89f6915e1","Type":"ContainerStarted","Data":"9076471be8c372424cfe26fa5ee0261718224bf57483a7a1ca193f66b9abc571"} Oct 09 11:26:54 crc kubenswrapper[4740]: I1009 11:26:54.067485 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mq5nd/crc-debug-t7c8h" podStartSLOduration=1.067463803 podStartE2EDuration="1.067463803s" podCreationTimestamp="2025-10-09 11:26:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 11:26:54.066887317 +0000 UTC m=+3553.029087698" watchObservedRunningTime="2025-10-09 11:26:54.067463803 +0000 UTC m=+3553.029664184" Oct 09 11:26:55 crc kubenswrapper[4740]: I1009 11:26:55.059251 4740 generic.go:334] "Generic (PLEG): container finished" podID="ef6a5c2d-165b-4f23-9efe-30d89f6915e1" containerID="0168a82f1393c2ad35f40ee0a56f7864d0b088d421b94bba222d28d277bb8a9e" exitCode=0 Oct 09 11:26:55 crc kubenswrapper[4740]: I1009 11:26:55.059295 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mq5nd/crc-debug-t7c8h" 
event={"ID":"ef6a5c2d-165b-4f23-9efe-30d89f6915e1","Type":"ContainerDied","Data":"0168a82f1393c2ad35f40ee0a56f7864d0b088d421b94bba222d28d277bb8a9e"} Oct 09 11:26:56 crc kubenswrapper[4740]: I1009 11:26:56.184865 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mq5nd/crc-debug-t7c8h" Oct 09 11:26:56 crc kubenswrapper[4740]: I1009 11:26:56.242370 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mq5nd/crc-debug-t7c8h"] Oct 09 11:26:56 crc kubenswrapper[4740]: I1009 11:26:56.249729 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef6a5c2d-165b-4f23-9efe-30d89f6915e1-host\") pod \"ef6a5c2d-165b-4f23-9efe-30d89f6915e1\" (UID: \"ef6a5c2d-165b-4f23-9efe-30d89f6915e1\") " Oct 09 11:26:56 crc kubenswrapper[4740]: I1009 11:26:56.249844 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7lxf\" (UniqueName: \"kubernetes.io/projected/ef6a5c2d-165b-4f23-9efe-30d89f6915e1-kube-api-access-g7lxf\") pod \"ef6a5c2d-165b-4f23-9efe-30d89f6915e1\" (UID: \"ef6a5c2d-165b-4f23-9efe-30d89f6915e1\") " Oct 09 11:26:56 crc kubenswrapper[4740]: I1009 11:26:56.250017 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef6a5c2d-165b-4f23-9efe-30d89f6915e1-host" (OuterVolumeSpecName: "host") pod "ef6a5c2d-165b-4f23-9efe-30d89f6915e1" (UID: "ef6a5c2d-165b-4f23-9efe-30d89f6915e1"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 11:26:56 crc kubenswrapper[4740]: I1009 11:26:56.250414 4740 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef6a5c2d-165b-4f23-9efe-30d89f6915e1-host\") on node \"crc\" DevicePath \"\"" Oct 09 11:26:56 crc kubenswrapper[4740]: I1009 11:26:56.251573 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mq5nd/crc-debug-t7c8h"] Oct 09 11:26:56 crc kubenswrapper[4740]: I1009 11:26:56.255373 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef6a5c2d-165b-4f23-9efe-30d89f6915e1-kube-api-access-g7lxf" (OuterVolumeSpecName: "kube-api-access-g7lxf") pod "ef6a5c2d-165b-4f23-9efe-30d89f6915e1" (UID: "ef6a5c2d-165b-4f23-9efe-30d89f6915e1"). InnerVolumeSpecName "kube-api-access-g7lxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:26:56 crc kubenswrapper[4740]: I1009 11:26:56.352656 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7lxf\" (UniqueName: \"kubernetes.io/projected/ef6a5c2d-165b-4f23-9efe-30d89f6915e1-kube-api-access-g7lxf\") on node \"crc\" DevicePath \"\"" Oct 09 11:26:57 crc kubenswrapper[4740]: I1009 11:26:57.098894 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9076471be8c372424cfe26fa5ee0261718224bf57483a7a1ca193f66b9abc571" Oct 09 11:26:57 crc kubenswrapper[4740]: I1009 11:26:57.098997 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mq5nd/crc-debug-t7c8h" Oct 09 11:26:57 crc kubenswrapper[4740]: I1009 11:26:57.458136 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mq5nd/crc-debug-6tq9z"] Oct 09 11:26:57 crc kubenswrapper[4740]: E1009 11:26:57.458736 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef6a5c2d-165b-4f23-9efe-30d89f6915e1" containerName="container-00" Oct 09 11:26:57 crc kubenswrapper[4740]: I1009 11:26:57.458776 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef6a5c2d-165b-4f23-9efe-30d89f6915e1" containerName="container-00" Oct 09 11:26:57 crc kubenswrapper[4740]: I1009 11:26:57.459018 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef6a5c2d-165b-4f23-9efe-30d89f6915e1" containerName="container-00" Oct 09 11:26:57 crc kubenswrapper[4740]: I1009 11:26:57.459858 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mq5nd/crc-debug-6tq9z" Oct 09 11:26:57 crc kubenswrapper[4740]: I1009 11:26:57.475579 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjqn7\" (UniqueName: \"kubernetes.io/projected/f5f13896-f36c-4860-9053-8c6a177c21df-kube-api-access-cjqn7\") pod \"crc-debug-6tq9z\" (UID: \"f5f13896-f36c-4860-9053-8c6a177c21df\") " pod="openshift-must-gather-mq5nd/crc-debug-6tq9z" Oct 09 11:26:57 crc kubenswrapper[4740]: I1009 11:26:57.475849 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5f13896-f36c-4860-9053-8c6a177c21df-host\") pod \"crc-debug-6tq9z\" (UID: \"f5f13896-f36c-4860-9053-8c6a177c21df\") " pod="openshift-must-gather-mq5nd/crc-debug-6tq9z" Oct 09 11:26:57 crc kubenswrapper[4740]: I1009 11:26:57.577911 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/f5f13896-f36c-4860-9053-8c6a177c21df-host\") pod \"crc-debug-6tq9z\" (UID: \"f5f13896-f36c-4860-9053-8c6a177c21df\") " pod="openshift-must-gather-mq5nd/crc-debug-6tq9z" Oct 09 11:26:57 crc kubenswrapper[4740]: I1009 11:26:57.578081 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjqn7\" (UniqueName: \"kubernetes.io/projected/f5f13896-f36c-4860-9053-8c6a177c21df-kube-api-access-cjqn7\") pod \"crc-debug-6tq9z\" (UID: \"f5f13896-f36c-4860-9053-8c6a177c21df\") " pod="openshift-must-gather-mq5nd/crc-debug-6tq9z" Oct 09 11:26:57 crc kubenswrapper[4740]: I1009 11:26:57.578087 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5f13896-f36c-4860-9053-8c6a177c21df-host\") pod \"crc-debug-6tq9z\" (UID: \"f5f13896-f36c-4860-9053-8c6a177c21df\") " pod="openshift-must-gather-mq5nd/crc-debug-6tq9z" Oct 09 11:26:57 crc kubenswrapper[4740]: I1009 11:26:57.598977 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjqn7\" (UniqueName: \"kubernetes.io/projected/f5f13896-f36c-4860-9053-8c6a177c21df-kube-api-access-cjqn7\") pod \"crc-debug-6tq9z\" (UID: \"f5f13896-f36c-4860-9053-8c6a177c21df\") " pod="openshift-must-gather-mq5nd/crc-debug-6tq9z" Oct 09 11:26:57 crc kubenswrapper[4740]: I1009 11:26:57.768013 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef6a5c2d-165b-4f23-9efe-30d89f6915e1" path="/var/lib/kubelet/pods/ef6a5c2d-165b-4f23-9efe-30d89f6915e1/volumes" Oct 09 11:26:57 crc kubenswrapper[4740]: I1009 11:26:57.781275 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mq5nd/crc-debug-6tq9z" Oct 09 11:26:57 crc kubenswrapper[4740]: W1009 11:26:57.817986 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5f13896_f36c_4860_9053_8c6a177c21df.slice/crio-bcd3e405daddb6df0dc93dc87a666569dadc93f74336bfe542991989c30aacee WatchSource:0}: Error finding container bcd3e405daddb6df0dc93dc87a666569dadc93f74336bfe542991989c30aacee: Status 404 returned error can't find the container with id bcd3e405daddb6df0dc93dc87a666569dadc93f74336bfe542991989c30aacee Oct 09 11:26:58 crc kubenswrapper[4740]: I1009 11:26:58.110372 4740 generic.go:334] "Generic (PLEG): container finished" podID="f5f13896-f36c-4860-9053-8c6a177c21df" containerID="6c4f4c08f87a2a71978793ce310662d4d408950f30e46d947e684525d3b8559e" exitCode=0 Oct 09 11:26:58 crc kubenswrapper[4740]: I1009 11:26:58.110430 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mq5nd/crc-debug-6tq9z" event={"ID":"f5f13896-f36c-4860-9053-8c6a177c21df","Type":"ContainerDied","Data":"6c4f4c08f87a2a71978793ce310662d4d408950f30e46d947e684525d3b8559e"} Oct 09 11:26:58 crc kubenswrapper[4740]: I1009 11:26:58.110531 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mq5nd/crc-debug-6tq9z" event={"ID":"f5f13896-f36c-4860-9053-8c6a177c21df","Type":"ContainerStarted","Data":"bcd3e405daddb6df0dc93dc87a666569dadc93f74336bfe542991989c30aacee"} Oct 09 11:26:58 crc kubenswrapper[4740]: I1009 11:26:58.158114 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mq5nd/crc-debug-6tq9z"] Oct 09 11:26:58 crc kubenswrapper[4740]: I1009 11:26:58.167641 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mq5nd/crc-debug-6tq9z"] Oct 09 11:26:59 crc kubenswrapper[4740]: I1009 11:26:59.239400 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mq5nd/crc-debug-6tq9z" Oct 09 11:26:59 crc kubenswrapper[4740]: I1009 11:26:59.319144 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5f13896-f36c-4860-9053-8c6a177c21df-host\") pod \"f5f13896-f36c-4860-9053-8c6a177c21df\" (UID: \"f5f13896-f36c-4860-9053-8c6a177c21df\") " Oct 09 11:26:59 crc kubenswrapper[4740]: I1009 11:26:59.319261 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5f13896-f36c-4860-9053-8c6a177c21df-host" (OuterVolumeSpecName: "host") pod "f5f13896-f36c-4860-9053-8c6a177c21df" (UID: "f5f13896-f36c-4860-9053-8c6a177c21df"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 11:26:59 crc kubenswrapper[4740]: I1009 11:26:59.319338 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjqn7\" (UniqueName: \"kubernetes.io/projected/f5f13896-f36c-4860-9053-8c6a177c21df-kube-api-access-cjqn7\") pod \"f5f13896-f36c-4860-9053-8c6a177c21df\" (UID: \"f5f13896-f36c-4860-9053-8c6a177c21df\") " Oct 09 11:26:59 crc kubenswrapper[4740]: I1009 11:26:59.319651 4740 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5f13896-f36c-4860-9053-8c6a177c21df-host\") on node \"crc\" DevicePath \"\"" Oct 09 11:26:59 crc kubenswrapper[4740]: I1009 11:26:59.326529 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5f13896-f36c-4860-9053-8c6a177c21df-kube-api-access-cjqn7" (OuterVolumeSpecName: "kube-api-access-cjqn7") pod "f5f13896-f36c-4860-9053-8c6a177c21df" (UID: "f5f13896-f36c-4860-9053-8c6a177c21df"). InnerVolumeSpecName "kube-api-access-cjqn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:26:59 crc kubenswrapper[4740]: I1009 11:26:59.420939 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjqn7\" (UniqueName: \"kubernetes.io/projected/f5f13896-f36c-4860-9053-8c6a177c21df-kube-api-access-cjqn7\") on node \"crc\" DevicePath \"\"" Oct 09 11:26:59 crc kubenswrapper[4740]: I1009 11:26:59.763630 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5f13896-f36c-4860-9053-8c6a177c21df" path="/var/lib/kubelet/pods/f5f13896-f36c-4860-9053-8c6a177c21df/volumes" Oct 09 11:27:00 crc kubenswrapper[4740]: I1009 11:27:00.134101 4740 scope.go:117] "RemoveContainer" containerID="6c4f4c08f87a2a71978793ce310662d4d408950f30e46d947e684525d3b8559e" Oct 09 11:27:00 crc kubenswrapper[4740]: I1009 11:27:00.134157 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mq5nd/crc-debug-6tq9z" Oct 09 11:27:00 crc kubenswrapper[4740]: I1009 11:27:00.297137 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-544757df48-b9dz7_dc9b7872-3887-45a9-8405-506862479e3f/barbican-api/0.log" Oct 09 11:27:00 crc kubenswrapper[4740]: I1009 11:27:00.351234 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-544757df48-b9dz7_dc9b7872-3887-45a9-8405-506862479e3f/barbican-api-log/0.log" Oct 09 11:27:00 crc kubenswrapper[4740]: I1009 11:27:00.527456 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5bd7855d54-lgqzd_72bcd07c-fbd9-44cb-8295-ba498f012009/barbican-keystone-listener/0.log" Oct 09 11:27:00 crc kubenswrapper[4740]: I1009 11:27:00.549191 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5bd7855d54-lgqzd_72bcd07c-fbd9-44cb-8295-ba498f012009/barbican-keystone-listener-log/0.log" Oct 09 11:27:00 crc kubenswrapper[4740]: I1009 11:27:00.659445 4740 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-59d79d879-w5c9m_8db36903-c2ef-429f-97dd-46e98c2a061b/barbican-worker/0.log" Oct 09 11:27:00 crc kubenswrapper[4740]: I1009 11:27:00.738261 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-59d79d879-w5c9m_8db36903-c2ef-429f-97dd-46e98c2a061b/barbican-worker-log/0.log" Oct 09 11:27:00 crc kubenswrapper[4740]: I1009 11:27:00.873206 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8_3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:27:00 crc kubenswrapper[4740]: I1009 11:27:00.999248 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5eefd278-fab1-4acc-acca-b6474799e6d1/ceilometer-central-agent/0.log" Oct 09 11:27:01 crc kubenswrapper[4740]: I1009 11:27:01.045044 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5eefd278-fab1-4acc-acca-b6474799e6d1/ceilometer-notification-agent/0.log" Oct 09 11:27:01 crc kubenswrapper[4740]: I1009 11:27:01.093986 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5eefd278-fab1-4acc-acca-b6474799e6d1/proxy-httpd/0.log" Oct 09 11:27:01 crc kubenswrapper[4740]: I1009 11:27:01.116737 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5eefd278-fab1-4acc-acca-b6474799e6d1/sg-core/0.log" Oct 09 11:27:01 crc kubenswrapper[4740]: I1009 11:27:01.282045 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_56b8e346-13ed-4f64-88af-13be77ceddfa/cinder-api/0.log" Oct 09 11:27:01 crc kubenswrapper[4740]: I1009 11:27:01.339606 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_56b8e346-13ed-4f64-88af-13be77ceddfa/cinder-api-log/0.log" Oct 09 11:27:01 crc kubenswrapper[4740]: I1009 
11:27:01.479019 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c/cinder-scheduler/0.log" Oct 09 11:27:01 crc kubenswrapper[4740]: I1009 11:27:01.554023 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c/probe/0.log" Oct 09 11:27:01 crc kubenswrapper[4740]: I1009 11:27:01.668929 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-wndn5_091c1607-1916-4dfd-9e3d-95dbe5534e98/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:27:01 crc kubenswrapper[4740]: I1009 11:27:01.867626 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7_d13c7792-e2d1-4ce2-b965-f77bd77b0cd0/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:27:01 crc kubenswrapper[4740]: I1009 11:27:01.883982 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr_6b62efe3-f320-4b06-9b4f-6cdebea2c83c/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:27:02 crc kubenswrapper[4740]: I1009 11:27:02.043488 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-sq7qc_a952fe70-b037-4995-a678-b3da7312dcee/init/0.log" Oct 09 11:27:02 crc kubenswrapper[4740]: I1009 11:27:02.205112 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-sq7qc_a952fe70-b037-4995-a678-b3da7312dcee/init/0.log" Oct 09 11:27:02 crc kubenswrapper[4740]: I1009 11:27:02.239914 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-sq7qc_a952fe70-b037-4995-a678-b3da7312dcee/dnsmasq-dns/0.log" Oct 09 11:27:02 crc kubenswrapper[4740]: I1009 11:27:02.291734 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-s79wz_7ced3562-d429-4443-9aa2-82901f4f7797/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:27:02 crc kubenswrapper[4740]: I1009 11:27:02.456513 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_83ecd586-6121-4e74-91f1-87267432cc2d/glance-httpd/0.log" Oct 09 11:27:02 crc kubenswrapper[4740]: I1009 11:27:02.494437 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_83ecd586-6121-4e74-91f1-87267432cc2d/glance-log/0.log" Oct 09 11:27:02 crc kubenswrapper[4740]: I1009 11:27:02.645700 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5eae963c-dabb-4da9-ac57-86a621088e55/glance-httpd/0.log" Oct 09 11:27:02 crc kubenswrapper[4740]: I1009 11:27:02.714185 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5eae963c-dabb-4da9-ac57-86a621088e55/glance-log/0.log" Oct 09 11:27:02 crc kubenswrapper[4740]: I1009 11:27:02.778429 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5dd4b95776-lcxbt_3762ae93-7451-4d99-aad4-f9c68666cf40/horizon/0.log" Oct 09 11:27:02 crc kubenswrapper[4740]: I1009 11:27:02.959638 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-h6knn_d611b217-c3b5-49dd-9a5f-acd64171310d/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:27:03 crc kubenswrapper[4740]: I1009 11:27:03.127707 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5dd4b95776-lcxbt_3762ae93-7451-4d99-aad4-f9c68666cf40/horizon-log/0.log" Oct 09 11:27:03 crc kubenswrapper[4740]: I1009 11:27:03.306299 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-8fhb7_52c814fd-0700-4e3e-8302-19324617f7c5/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:27:03 crc kubenswrapper[4740]: I1009 11:27:03.564698 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29333461-s8498_56c17d1b-f3e0-4ca7-ad1f-ac1314036f59/keystone-cron/0.log" Oct 09 11:27:03 crc kubenswrapper[4740]: I1009 11:27:03.571390 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5db569f5cf-ksc2p_d2497d66-a643-4eb4-b69d-725db422cb3a/keystone-api/0.log" Oct 09 11:27:03 crc kubenswrapper[4740]: I1009 11:27:03.743276 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_36a3a627-eea7-4034-a615-38c388851e07/kube-state-metrics/0.log" Oct 09 11:27:03 crc kubenswrapper[4740]: I1009 11:27:03.907307 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-nrg72_55748bea-018d-4297-8939-ffec480b42ba/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:27:04 crc kubenswrapper[4740]: I1009 11:27:04.156854 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-d74f6589-zvlln_94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19/neutron-httpd/0.log" Oct 09 11:27:04 crc kubenswrapper[4740]: I1009 11:27:04.184394 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-d74f6589-zvlln_94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19/neutron-api/0.log" Oct 09 11:27:04 crc kubenswrapper[4740]: I1009 11:27:04.226798 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj_cdb17de8-f861-4899-8e4d-455cd554cf43/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:27:04 crc kubenswrapper[4740]: I1009 11:27:04.740174 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_260e0d21-3655-4a0d-a51e-6c483e20c7f5/nova-api-log/0.log" Oct 09 11:27:04 crc kubenswrapper[4740]: I1009 11:27:04.756215 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_305a12c3-450d-43fc-87bb-9bb293438451/nova-cell0-conductor-conductor/0.log" Oct 09 11:27:04 crc kubenswrapper[4740]: I1009 11:27:04.974102 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_c11f736a-8dcf-45d6-9f8d-7ff8866458fb/nova-cell1-conductor-conductor/0.log" Oct 09 11:27:04 crc kubenswrapper[4740]: I1009 11:27:04.992179 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_260e0d21-3655-4a0d-a51e-6c483e20c7f5/nova-api-api/0.log" Oct 09 11:27:05 crc kubenswrapper[4740]: I1009 11:27:05.056906 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5e2d4c31-bba4-46d5-8119-b0970e10437d/nova-cell1-novncproxy-novncproxy/0.log" Oct 09 11:27:05 crc kubenswrapper[4740]: I1009 11:27:05.240827 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-frpjr_914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:27:05 crc kubenswrapper[4740]: I1009 11:27:05.411990 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 11:27:05 crc kubenswrapper[4740]: I1009 11:27:05.412390 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Oct 09 11:27:05 crc kubenswrapper[4740]: I1009 11:27:05.416347 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_df3b597d-a996-4ebd-b896-61c6c62a0145/nova-metadata-log/0.log" Oct 09 11:27:05 crc kubenswrapper[4740]: I1009 11:27:05.656109 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_3bb674db-a6fb-4100-82d2-2fae6660902b/nova-scheduler-scheduler/0.log" Oct 09 11:27:05 crc kubenswrapper[4740]: I1009 11:27:05.680235 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_dcd9d52b-8167-47f9-8c36-b75f88119ad5/mysql-bootstrap/0.log" Oct 09 11:27:05 crc kubenswrapper[4740]: I1009 11:27:05.911763 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_dcd9d52b-8167-47f9-8c36-b75f88119ad5/mysql-bootstrap/0.log" Oct 09 11:27:05 crc kubenswrapper[4740]: I1009 11:27:05.950721 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_dcd9d52b-8167-47f9-8c36-b75f88119ad5/galera/0.log" Oct 09 11:27:06 crc kubenswrapper[4740]: I1009 11:27:06.113171 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3dce8908-af4b-4596-bed2-02788a615207/mysql-bootstrap/0.log" Oct 09 11:27:06 crc kubenswrapper[4740]: I1009 11:27:06.278765 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3dce8908-af4b-4596-bed2-02788a615207/mysql-bootstrap/0.log" Oct 09 11:27:06 crc kubenswrapper[4740]: I1009 11:27:06.344602 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3dce8908-af4b-4596-bed2-02788a615207/galera/0.log" Oct 09 11:27:06 crc kubenswrapper[4740]: I1009 11:27:06.462493 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_efa49127-ef96-4ed6-8b72-c106e5575707/openstackclient/0.log" Oct 09 11:27:06 crc 
kubenswrapper[4740]: I1009 11:27:06.480673 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_df3b597d-a996-4ebd-b896-61c6c62a0145/nova-metadata-metadata/0.log" Oct 09 11:27:06 crc kubenswrapper[4740]: I1009 11:27:06.605168 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-c6rld_7f56ff38-de3a-4c48-8fc0-43e0eac26c55/ovn-controller/0.log" Oct 09 11:27:06 crc kubenswrapper[4740]: I1009 11:27:06.717709 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-pn5fj_94fc74ef-6b90-4c9a-9da5-d7eb116a7806/openstack-network-exporter/0.log" Oct 09 11:27:06 crc kubenswrapper[4740]: I1009 11:27:06.793696 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cwdss_5a1841e0-a15d-4dca-a1a4-6b50f338ddbc/ovsdb-server-init/0.log" Oct 09 11:27:07 crc kubenswrapper[4740]: I1009 11:27:07.000548 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cwdss_5a1841e0-a15d-4dca-a1a4-6b50f338ddbc/ovsdb-server-init/0.log" Oct 09 11:27:07 crc kubenswrapper[4740]: I1009 11:27:07.045952 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cwdss_5a1841e0-a15d-4dca-a1a4-6b50f338ddbc/ovs-vswitchd/0.log" Oct 09 11:27:07 crc kubenswrapper[4740]: I1009 11:27:07.101520 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cwdss_5a1841e0-a15d-4dca-a1a4-6b50f338ddbc/ovsdb-server/0.log" Oct 09 11:27:07 crc kubenswrapper[4740]: I1009 11:27:07.287129 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-hd9p5_72061500-62b1-404d-8def-280fcca2e73f/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:27:07 crc kubenswrapper[4740]: I1009 11:27:07.313921 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_eda487bd-e994-4fce-86f9-50e85aaf30b2/openstack-network-exporter/0.log" Oct 09 11:27:07 crc kubenswrapper[4740]: I1009 11:27:07.348708 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_eda487bd-e994-4fce-86f9-50e85aaf30b2/ovn-northd/0.log" Oct 09 11:27:07 crc kubenswrapper[4740]: I1009 11:27:07.502214 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0c37175d-6801-461a-82ba-ea611afdaebf/openstack-network-exporter/0.log" Oct 09 11:27:07 crc kubenswrapper[4740]: I1009 11:27:07.544644 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0c37175d-6801-461a-82ba-ea611afdaebf/ovsdbserver-nb/0.log" Oct 09 11:27:07 crc kubenswrapper[4740]: I1009 11:27:07.667679 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_52b13ae3-8184-4ea2-a6b5-14d739b1200e/openstack-network-exporter/0.log" Oct 09 11:27:07 crc kubenswrapper[4740]: I1009 11:27:07.750406 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_52b13ae3-8184-4ea2-a6b5-14d739b1200e/ovsdbserver-sb/0.log" Oct 09 11:27:07 crc kubenswrapper[4740]: I1009 11:27:07.860026 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-55c77867db-hsc8q_48968716-1198-429f-90f0-ab6663baaed5/placement-api/0.log" Oct 09 11:27:07 crc kubenswrapper[4740]: I1009 11:27:07.941372 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-55c77867db-hsc8q_48968716-1198-429f-90f0-ab6663baaed5/placement-log/0.log" Oct 09 11:27:08 crc kubenswrapper[4740]: I1009 11:27:08.016531 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46/setup-container/0.log" Oct 09 11:27:08 crc kubenswrapper[4740]: I1009 11:27:08.264871 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46/rabbitmq/0.log" Oct 09 11:27:08 crc kubenswrapper[4740]: I1009 11:27:08.265238 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46/setup-container/0.log" Oct 09 11:27:08 crc kubenswrapper[4740]: I1009 11:27:08.305226 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ff4b6585-91c6-48f8-ba40-5cd075c7c59e/setup-container/0.log" Oct 09 11:27:08 crc kubenswrapper[4740]: I1009 11:27:08.547195 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg_3f30a224-f5af-498e-97f3-28a5a26f9884/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:27:08 crc kubenswrapper[4740]: I1009 11:27:08.560130 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ff4b6585-91c6-48f8-ba40-5cd075c7c59e/rabbitmq/0.log" Oct 09 11:27:08 crc kubenswrapper[4740]: I1009 11:27:08.573600 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ff4b6585-91c6-48f8-ba40-5cd075c7c59e/setup-container/0.log" Oct 09 11:27:08 crc kubenswrapper[4740]: I1009 11:27:08.785398 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-4jjwr_1bb88dfa-ffc6-433a-9df2-f00e2a6805e7/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:27:08 crc kubenswrapper[4740]: I1009 11:27:08.876975 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s_945008af-c262-4581-8f40-51b8fe5a9dd8/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:27:08 crc kubenswrapper[4740]: I1009 11:27:08.988290 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-w8hc6_c79d4035-1be0-44ff-9ddd-0a65a54be7ed/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:27:09 crc kubenswrapper[4740]: I1009 11:27:09.130450 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-9vbmb_0cc7d46b-528d-415b-a1cf-34ea3e4483b5/ssh-known-hosts-edpm-deployment/0.log" Oct 09 11:27:09 crc kubenswrapper[4740]: I1009 11:27:09.321077 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-66569d88ff-tjljh_501b9024-4f9f-41eb-ae73-d9ecb0637363/proxy-server/0.log" Oct 09 11:27:09 crc kubenswrapper[4740]: I1009 11:27:09.365999 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-66569d88ff-tjljh_501b9024-4f9f-41eb-ae73-d9ecb0637363/proxy-httpd/0.log" Oct 09 11:27:09 crc kubenswrapper[4740]: I1009 11:27:09.416820 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-vzk5q_ebeb8396-40be-4400-8a2f-d1cdeb8c20e4/swift-ring-rebalance/0.log" Oct 09 11:27:09 crc kubenswrapper[4740]: I1009 11:27:09.583767 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/account-reaper/0.log" Oct 09 11:27:09 crc kubenswrapper[4740]: I1009 11:27:09.599803 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/account-auditor/0.log" Oct 09 11:27:09 crc kubenswrapper[4740]: I1009 11:27:09.643044 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/account-replicator/0.log" Oct 09 11:27:09 crc kubenswrapper[4740]: I1009 11:27:09.820158 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/container-auditor/0.log" Oct 09 11:27:09 crc kubenswrapper[4740]: I1009 
11:27:09.825319 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/account-server/0.log" Oct 09 11:27:09 crc kubenswrapper[4740]: I1009 11:27:09.842365 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/container-replicator/0.log" Oct 09 11:27:09 crc kubenswrapper[4740]: I1009 11:27:09.856940 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/container-server/0.log" Oct 09 11:27:10 crc kubenswrapper[4740]: I1009 11:27:10.008432 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/container-updater/0.log" Oct 09 11:27:10 crc kubenswrapper[4740]: I1009 11:27:10.054091 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/object-expirer/0.log" Oct 09 11:27:10 crc kubenswrapper[4740]: I1009 11:27:10.066393 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/object-replicator/0.log" Oct 09 11:27:10 crc kubenswrapper[4740]: I1009 11:27:10.101462 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/object-auditor/0.log" Oct 09 11:27:10 crc kubenswrapper[4740]: I1009 11:27:10.231880 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/object-server/0.log" Oct 09 11:27:10 crc kubenswrapper[4740]: I1009 11:27:10.241099 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/object-updater/0.log" Oct 09 11:27:10 crc kubenswrapper[4740]: I1009 11:27:10.334633 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/rsync/0.log" Oct 09 11:27:10 crc kubenswrapper[4740]: I1009 11:27:10.357223 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/swift-recon-cron/0.log" Oct 09 11:27:10 crc kubenswrapper[4740]: I1009 11:27:10.487514 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-ggkft_40e8133a-5380-4983-a96f-8f28d50108a9/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:27:10 crc kubenswrapper[4740]: I1009 11:27:10.615156 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_4c1a2aba-0872-4bef-9bad-0ba37788423d/tempest-tests-tempest-tests-runner/0.log" Oct 09 11:27:10 crc kubenswrapper[4740]: I1009 11:27:10.731864 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0486d8ef-3b23-4d76-9764-a3d48c174482/test-operator-logs-container/0.log" Oct 09 11:27:10 crc kubenswrapper[4740]: I1009 11:27:10.878073 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq_19338c28-ee36-4273-8f74-f34767a3fcb1/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:27:20 crc kubenswrapper[4740]: I1009 11:27:20.061198 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f16fec59-73b1-4b57-ab47-c1767c6c2a7d/memcached/0.log" Oct 09 11:27:34 crc kubenswrapper[4740]: I1009 11:27:34.076563 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-jqnnv_93f4faa8-4d5e-48d9-ac5a-bb1468f972d3/kube-rbac-proxy/0.log" Oct 09 11:27:34 crc kubenswrapper[4740]: I1009 11:27:34.169899 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-jqnnv_93f4faa8-4d5e-48d9-ac5a-bb1468f972d3/manager/0.log" Oct 09 11:27:34 crc kubenswrapper[4740]: I1009 11:27:34.244201 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-mp2p4_95b86671-972c-4a57-b68b-0421b82bd3d4/kube-rbac-proxy/0.log" Oct 09 11:27:34 crc kubenswrapper[4740]: I1009 11:27:34.336388 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-mp2p4_95b86671-972c-4a57-b68b-0421b82bd3d4/manager/0.log" Oct 09 11:27:34 crc kubenswrapper[4740]: I1009 11:27:34.436723 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44_d1d10dcf-922d-4d14-ac25-0b8482757670/util/0.log" Oct 09 11:27:34 crc kubenswrapper[4740]: I1009 11:27:34.613028 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44_d1d10dcf-922d-4d14-ac25-0b8482757670/pull/0.log" Oct 09 11:27:34 crc kubenswrapper[4740]: I1009 11:27:34.620811 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44_d1d10dcf-922d-4d14-ac25-0b8482757670/util/0.log" Oct 09 11:27:34 crc kubenswrapper[4740]: I1009 11:27:34.626247 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44_d1d10dcf-922d-4d14-ac25-0b8482757670/pull/0.log" Oct 09 11:27:34 crc kubenswrapper[4740]: I1009 11:27:34.779034 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44_d1d10dcf-922d-4d14-ac25-0b8482757670/pull/0.log" Oct 09 11:27:34 crc kubenswrapper[4740]: 
I1009 11:27:34.792924 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44_d1d10dcf-922d-4d14-ac25-0b8482757670/util/0.log" Oct 09 11:27:34 crc kubenswrapper[4740]: I1009 11:27:34.800234 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44_d1d10dcf-922d-4d14-ac25-0b8482757670/extract/0.log" Oct 09 11:27:34 crc kubenswrapper[4740]: I1009 11:27:34.962738 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-w2ftw_2b3fb0c8-988f-4ed4-86e0-77db8e5e06a8/kube-rbac-proxy/0.log" Oct 09 11:27:34 crc kubenswrapper[4740]: I1009 11:27:34.970412 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-w2ftw_2b3fb0c8-988f-4ed4-86e0-77db8e5e06a8/manager/0.log" Oct 09 11:27:35 crc kubenswrapper[4740]: I1009 11:27:35.000859 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-268g9_a3c08e43-cc8b-433e-ba8e-fd225eef09ed/kube-rbac-proxy/0.log" Oct 09 11:27:35 crc kubenswrapper[4740]: I1009 11:27:35.176207 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-sf7cf_1519e3af-34c9-4722-9aaa-8a10ef0d49de/kube-rbac-proxy/0.log" Oct 09 11:27:35 crc kubenswrapper[4740]: I1009 11:27:35.194539 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-268g9_a3c08e43-cc8b-433e-ba8e-fd225eef09ed/manager/0.log" Oct 09 11:27:35 crc kubenswrapper[4740]: I1009 11:27:35.219932 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-sf7cf_1519e3af-34c9-4722-9aaa-8a10ef0d49de/manager/0.log" Oct 09 11:27:35 crc kubenswrapper[4740]: I1009 11:27:35.368925 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-p4btw_5348e551-de55-4c32-af1e-ac9facc061d9/kube-rbac-proxy/0.log" Oct 09 11:27:35 crc kubenswrapper[4740]: I1009 11:27:35.384088 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-p4btw_5348e551-de55-4c32-af1e-ac9facc061d9/manager/0.log" Oct 09 11:27:35 crc kubenswrapper[4740]: I1009 11:27:35.408104 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 11:27:35 crc kubenswrapper[4740]: I1009 11:27:35.408152 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 11:27:35 crc kubenswrapper[4740]: I1009 11:27:35.408196 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 11:27:35 crc kubenswrapper[4740]: I1009 11:27:35.409083 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930"} pod="openshift-machine-config-operator/machine-config-daemon-kdjch" containerMessage="Container machine-config-daemon 
failed liveness probe, will be restarted" Oct 09 11:27:35 crc kubenswrapper[4740]: I1009 11:27:35.409143 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" containerID="cri-o://222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" gracePeriod=600 Oct 09 11:27:35 crc kubenswrapper[4740]: I1009 11:27:35.543663 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-4zsvx_27b8cb71-5bd2-4133-bf5a-db571521861b/kube-rbac-proxy/0.log" Oct 09 11:27:35 crc kubenswrapper[4740]: E1009 11:27:35.566196 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:27:35 crc kubenswrapper[4740]: I1009 11:27:35.643155 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-prf5f_8ae60958-f755-47fd-891b-74356bff787c/kube-rbac-proxy/0.log" Oct 09 11:27:35 crc kubenswrapper[4740]: I1009 11:27:35.700081 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-4zsvx_27b8cb71-5bd2-4133-bf5a-db571521861b/manager/0.log" Oct 09 11:27:35 crc kubenswrapper[4740]: I1009 11:27:35.760284 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-prf5f_8ae60958-f755-47fd-891b-74356bff787c/manager/0.log" Oct 09 11:27:35 crc kubenswrapper[4740]: I1009 
11:27:35.849977 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-hsl94_67fd364b-d05e-4d57-a817-3f64be5cdba0/kube-rbac-proxy/0.log" Oct 09 11:27:35 crc kubenswrapper[4740]: I1009 11:27:35.959931 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-hsl94_67fd364b-d05e-4d57-a817-3f64be5cdba0/manager/0.log" Oct 09 11:27:35 crc kubenswrapper[4740]: I1009 11:27:35.965995 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-6jtst_f3e10f4d-eabe-4818-b34f-96dd2ba4d4a1/kube-rbac-proxy/0.log" Oct 09 11:27:36 crc kubenswrapper[4740]: I1009 11:27:36.050920 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-6jtst_f3e10f4d-eabe-4818-b34f-96dd2ba4d4a1/manager/0.log" Oct 09 11:27:36 crc kubenswrapper[4740]: I1009 11:27:36.123101 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-cz2dz_8313eb28-2711-404c-817c-b782ea1cf41a/manager/0.log" Oct 09 11:27:36 crc kubenswrapper[4740]: I1009 11:27:36.131502 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-cz2dz_8313eb28-2711-404c-817c-b782ea1cf41a/kube-rbac-proxy/0.log" Oct 09 11:27:36 crc kubenswrapper[4740]: I1009 11:27:36.278141 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-h4lw2_254d742d-881a-4ea9-97fd-2246d7109a77/kube-rbac-proxy/0.log" Oct 09 11:27:36 crc kubenswrapper[4740]: I1009 11:27:36.316768 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-h4lw2_254d742d-881a-4ea9-97fd-2246d7109a77/manager/0.log" 
Oct 09 11:27:36 crc kubenswrapper[4740]: I1009 11:27:36.435200 4740 generic.go:334] "Generic (PLEG): container finished" podID="223b849a-db98-4f56-a649-9e144189950a" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" exitCode=0 Oct 09 11:27:36 crc kubenswrapper[4740]: I1009 11:27:36.435245 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerDied","Data":"222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930"} Oct 09 11:27:36 crc kubenswrapper[4740]: I1009 11:27:36.435279 4740 scope.go:117] "RemoveContainer" containerID="51918591255fd9bc84070769a7fb279ef80f14e662d5784322c40cfbec726c46" Oct 09 11:27:36 crc kubenswrapper[4740]: I1009 11:27:36.435972 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:27:36 crc kubenswrapper[4740]: E1009 11:27:36.436317 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:27:36 crc kubenswrapper[4740]: I1009 11:27:36.437015 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-kv5jg_676e4e26-21ec-4b2c-ab3f-bc593cddfb33/kube-rbac-proxy/0.log" Oct 09 11:27:36 crc kubenswrapper[4740]: I1009 11:27:36.493353 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-8nkmb_a9f75d3c-e107-48aa-b15b-442b785b8945/kube-rbac-proxy/0.log" Oct 09 11:27:36 crc kubenswrapper[4740]: 
I1009 11:27:36.545979 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-kv5jg_676e4e26-21ec-4b2c-ab3f-bc593cddfb33/manager/0.log" Oct 09 11:27:36 crc kubenswrapper[4740]: I1009 11:27:36.684577 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-8nkmb_a9f75d3c-e107-48aa-b15b-442b785b8945/manager/0.log" Oct 09 11:27:36 crc kubenswrapper[4740]: I1009 11:27:36.710993 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx_3179f3c7-2f14-494b-9fea-3c217a11af2b/kube-rbac-proxy/0.log" Oct 09 11:27:36 crc kubenswrapper[4740]: I1009 11:27:36.745287 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx_3179f3c7-2f14-494b-9fea-3c217a11af2b/manager/0.log" Oct 09 11:27:36 crc kubenswrapper[4740]: I1009 11:27:36.876217 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5647484f69-cxbqt_cd1eb7dc-bc88-4a8e-b681-751ebdf2089f/kube-rbac-proxy/0.log" Oct 09 11:27:37 crc kubenswrapper[4740]: I1009 11:27:37.087937 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6c6ccc6df6-wxp9l_e0e26b11-7270-46ec-9042-0eaab1e2a459/kube-rbac-proxy/0.log" Oct 09 11:27:37 crc kubenswrapper[4740]: I1009 11:27:37.199815 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6c6ccc6df6-wxp9l_e0e26b11-7270-46ec-9042-0eaab1e2a459/operator/0.log" Oct 09 11:27:37 crc kubenswrapper[4740]: I1009 11:27:37.288726 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-index-gvdpg_aad20335-936b-4ec4-aced-424bf31edf74/registry-server/0.log" Oct 09 11:27:37 crc kubenswrapper[4740]: I1009 11:27:37.494978 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f96f8c84-r494n_7e7f599f-1cc9-41fc-b683-8b0de6e48761/kube-rbac-proxy/0.log" Oct 09 11:27:37 crc kubenswrapper[4740]: I1009 11:27:37.562850 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f96f8c84-r494n_7e7f599f-1cc9-41fc-b683-8b0de6e48761/manager/0.log" Oct 09 11:27:37 crc kubenswrapper[4740]: I1009 11:27:37.680947 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-q5tlk_f1909d9f-c6e3-4c55-93f5-be679e3c3792/kube-rbac-proxy/0.log" Oct 09 11:27:37 crc kubenswrapper[4740]: I1009 11:27:37.784350 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-q5tlk_f1909d9f-c6e3-4c55-93f5-be679e3c3792/manager/0.log" Oct 09 11:27:37 crc kubenswrapper[4740]: I1009 11:27:37.815434 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-c5n9g_00c4b19b-1c03-4fc2-9ac1-39ca45ca9570/operator/0.log" Oct 09 11:27:37 crc kubenswrapper[4740]: I1009 11:27:37.900414 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5647484f69-cxbqt_cd1eb7dc-bc88-4a8e-b681-751ebdf2089f/manager/0.log" Oct 09 11:27:37 crc kubenswrapper[4740]: I1009 11:27:37.943212 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-gg5c4_1a6180f0-55bd-4c7e-a96c-97762cace534/kube-rbac-proxy/0.log" Oct 09 11:27:38 crc kubenswrapper[4740]: I1009 11:27:38.022068 4740 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-gg5c4_1a6180f0-55bd-4c7e-a96c-97762cace534/manager/0.log" Oct 09 11:27:38 crc kubenswrapper[4740]: I1009 11:27:38.143263 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-6fng6_594a8f26-8acc-44a8-b024-665012e570f6/kube-rbac-proxy/0.log" Oct 09 11:27:38 crc kubenswrapper[4740]: I1009 11:27:38.144300 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-6fng6_594a8f26-8acc-44a8-b024-665012e570f6/manager/0.log" Oct 09 11:27:38 crc kubenswrapper[4740]: I1009 11:27:38.201368 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-mxz8d_d454e0e1-1745-4fc0-aea1-9d231de7fa65/kube-rbac-proxy/0.log" Oct 09 11:27:38 crc kubenswrapper[4740]: I1009 11:27:38.239225 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-mxz8d_d454e0e1-1745-4fc0-aea1-9d231de7fa65/manager/0.log" Oct 09 11:27:38 crc kubenswrapper[4740]: I1009 11:27:38.327006 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-ngw9n_65d851bd-9407-48da-bac4-d3b07bab1d46/kube-rbac-proxy/0.log" Oct 09 11:27:38 crc kubenswrapper[4740]: I1009 11:27:38.369228 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-ngw9n_65d851bd-9407-48da-bac4-d3b07bab1d46/manager/0.log" Oct 09 11:27:51 crc kubenswrapper[4740]: I1009 11:27:51.785989 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:27:51 crc kubenswrapper[4740]: E1009 11:27:51.786982 4740 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:27:54 crc kubenswrapper[4740]: I1009 11:27:54.683939 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-mr7wc_91bc0d62-1ab0-4ca6-ad8d-ab99a4eea54b/control-plane-machine-set-operator/0.log" Oct 09 11:27:54 crc kubenswrapper[4740]: I1009 11:27:54.854526 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pxg57_2cfbd3fb-f7f5-4578-9e24-72dbd185cf12/kube-rbac-proxy/0.log" Oct 09 11:27:54 crc kubenswrapper[4740]: I1009 11:27:54.873100 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pxg57_2cfbd3fb-f7f5-4578-9e24-72dbd185cf12/machine-api-operator/0.log" Oct 09 11:28:05 crc kubenswrapper[4740]: I1009 11:28:05.754276 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:28:05 crc kubenswrapper[4740]: E1009 11:28:05.756184 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:28:07 crc kubenswrapper[4740]: I1009 11:28:07.413878 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-vsq7z_9b5b17c6-4d72-4295-bb2b-436b65625a66/cert-manager-controller/0.log" Oct 09 11:28:07 crc kubenswrapper[4740]: I1009 11:28:07.533960 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-hlrtz_7d2a1d30-c83b-41ce-839e-3eb1f655a1c3/cert-manager-cainjector/0.log" Oct 09 11:28:07 crc kubenswrapper[4740]: I1009 11:28:07.585348 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-rdl5l_a9ffc41f-4710-469a-bae3-ae15d4eafd9b/cert-manager-webhook/0.log" Oct 09 11:28:19 crc kubenswrapper[4740]: I1009 11:28:19.877424 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-5x42v_edfeadfc-4f2b-4004-9a2d-98b6b8bbe448/nmstate-console-plugin/0.log" Oct 09 11:28:20 crc kubenswrapper[4740]: I1009 11:28:20.058082 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-f9gt2_9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d/nmstate-handler/0.log" Oct 09 11:28:20 crc kubenswrapper[4740]: I1009 11:28:20.120208 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-fvr8l_eb19fbda-c268-4dce-9ef4-e2b69aaa8dfd/nmstate-metrics/0.log" Oct 09 11:28:20 crc kubenswrapper[4740]: I1009 11:28:20.134853 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-fvr8l_eb19fbda-c268-4dce-9ef4-e2b69aaa8dfd/kube-rbac-proxy/0.log" Oct 09 11:28:20 crc kubenswrapper[4740]: I1009 11:28:20.286332 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-8f6tc_a286b66d-1660-424c-b244-d889a099262c/nmstate-operator/0.log" Oct 09 11:28:20 crc kubenswrapper[4740]: I1009 11:28:20.347311 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-bslx4_1f9567a2-9e5d-4996-a625-1bcaca30d9a9/nmstate-webhook/0.log" Oct 09 11:28:20 crc kubenswrapper[4740]: I1009 11:28:20.753577 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:28:20 crc kubenswrapper[4740]: E1009 11:28:20.753933 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:28:33 crc kubenswrapper[4740]: I1009 11:28:33.757086 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:28:33 crc kubenswrapper[4740]: E1009 11:28:33.757956 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:28:34 crc kubenswrapper[4740]: I1009 11:28:34.979552 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-8dwvb_e58ad5c0-164a-4ed4-b665-44068078198c/kube-rbac-proxy/0.log" Oct 09 11:28:35 crc kubenswrapper[4740]: I1009 11:28:35.062360 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-8dwvb_e58ad5c0-164a-4ed4-b665-44068078198c/controller/0.log" Oct 09 11:28:35 crc kubenswrapper[4740]: I1009 11:28:35.143194 4740 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-frr-files/0.log" Oct 09 11:28:35 crc kubenswrapper[4740]: I1009 11:28:35.345165 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-metrics/0.log" Oct 09 11:28:35 crc kubenswrapper[4740]: I1009 11:28:35.347068 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-frr-files/0.log" Oct 09 11:28:35 crc kubenswrapper[4740]: I1009 11:28:35.358516 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-reloader/0.log" Oct 09 11:28:35 crc kubenswrapper[4740]: I1009 11:28:35.411123 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-reloader/0.log" Oct 09 11:28:35 crc kubenswrapper[4740]: I1009 11:28:35.570518 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-metrics/0.log" Oct 09 11:28:35 crc kubenswrapper[4740]: I1009 11:28:35.603212 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-frr-files/0.log" Oct 09 11:28:35 crc kubenswrapper[4740]: I1009 11:28:35.604838 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-metrics/0.log" Oct 09 11:28:35 crc kubenswrapper[4740]: I1009 11:28:35.639424 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-reloader/0.log" Oct 09 11:28:35 crc kubenswrapper[4740]: I1009 11:28:35.758969 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-frr-files/0.log" Oct 09 11:28:35 crc kubenswrapper[4740]: I1009 11:28:35.809055 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-metrics/0.log" Oct 09 11:28:35 crc kubenswrapper[4740]: I1009 11:28:35.820725 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-reloader/0.log" Oct 09 11:28:35 crc kubenswrapper[4740]: I1009 11:28:35.823891 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/controller/0.log" Oct 09 11:28:35 crc kubenswrapper[4740]: I1009 11:28:35.980875 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/frr-metrics/0.log" Oct 09 11:28:36 crc kubenswrapper[4740]: I1009 11:28:36.004641 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/kube-rbac-proxy/0.log" Oct 09 11:28:36 crc kubenswrapper[4740]: I1009 11:28:36.007168 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/kube-rbac-proxy-frr/0.log" Oct 09 11:28:36 crc kubenswrapper[4740]: I1009 11:28:36.148942 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/reloader/0.log" Oct 09 11:28:36 crc kubenswrapper[4740]: I1009 11:28:36.257605 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-76cjs_01117083-3a07-4afc-b678-11b52fd9edea/frr-k8s-webhook-server/0.log" Oct 09 11:28:36 crc kubenswrapper[4740]: I1009 11:28:36.453958 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7698d6f4f4-hrnng_97758f55-d70a-4949-b056-5673a1975dd5/manager/0.log" Oct 09 11:28:36 crc kubenswrapper[4740]: I1009 11:28:36.601123 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6749d4b858-4656g_9f363b47-14ab-4719-93dd-db269dc8f132/webhook-server/0.log" Oct 09 11:28:36 crc kubenswrapper[4740]: I1009 11:28:36.745884 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qzfv2_3bb4221a-8b49-4183-a53b-6f81deafb446/kube-rbac-proxy/0.log" Oct 09 11:28:37 crc kubenswrapper[4740]: I1009 11:28:37.263438 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qzfv2_3bb4221a-8b49-4183-a53b-6f81deafb446/speaker/0.log" Oct 09 11:28:37 crc kubenswrapper[4740]: I1009 11:28:37.312208 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/frr/0.log" Oct 09 11:28:44 crc kubenswrapper[4740]: I1009 11:28:44.754480 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:28:44 crc kubenswrapper[4740]: E1009 11:28:44.755139 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:28:48 crc kubenswrapper[4740]: I1009 11:28:48.927099 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk_3544dc93-111d-4a49-90fc-92d76ad66184/util/0.log" Oct 09 11:28:49 crc kubenswrapper[4740]: I1009 
11:28:49.046414 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk_3544dc93-111d-4a49-90fc-92d76ad66184/util/0.log" Oct 09 11:28:49 crc kubenswrapper[4740]: I1009 11:28:49.085600 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk_3544dc93-111d-4a49-90fc-92d76ad66184/pull/0.log" Oct 09 11:28:49 crc kubenswrapper[4740]: I1009 11:28:49.112092 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk_3544dc93-111d-4a49-90fc-92d76ad66184/pull/0.log" Oct 09 11:28:49 crc kubenswrapper[4740]: I1009 11:28:49.246328 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk_3544dc93-111d-4a49-90fc-92d76ad66184/util/0.log" Oct 09 11:28:49 crc kubenswrapper[4740]: I1009 11:28:49.267142 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk_3544dc93-111d-4a49-90fc-92d76ad66184/extract/0.log" Oct 09 11:28:49 crc kubenswrapper[4740]: I1009 11:28:49.267457 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk_3544dc93-111d-4a49-90fc-92d76ad66184/pull/0.log" Oct 09 11:28:49 crc kubenswrapper[4740]: I1009 11:28:49.422428 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mz9pq_7e490083-3e13-4528-b489-8a7555abb9be/extract-utilities/0.log" Oct 09 11:28:49 crc kubenswrapper[4740]: I1009 11:28:49.561060 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-mz9pq_7e490083-3e13-4528-b489-8a7555abb9be/extract-utilities/0.log" Oct 09 11:28:49 crc kubenswrapper[4740]: I1009 11:28:49.574936 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mz9pq_7e490083-3e13-4528-b489-8a7555abb9be/extract-content/0.log" Oct 09 11:28:49 crc kubenswrapper[4740]: I1009 11:28:49.578885 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mz9pq_7e490083-3e13-4528-b489-8a7555abb9be/extract-content/0.log" Oct 09 11:28:49 crc kubenswrapper[4740]: I1009 11:28:49.750418 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mz9pq_7e490083-3e13-4528-b489-8a7555abb9be/extract-content/0.log" Oct 09 11:28:49 crc kubenswrapper[4740]: I1009 11:28:49.783725 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mz9pq_7e490083-3e13-4528-b489-8a7555abb9be/extract-utilities/0.log" Oct 09 11:28:49 crc kubenswrapper[4740]: I1009 11:28:49.992047 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p8hx5_eb472a61-940e-48b4-be35-bec21b4eca3c/extract-utilities/0.log" Oct 09 11:28:50 crc kubenswrapper[4740]: I1009 11:28:50.132730 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mz9pq_7e490083-3e13-4528-b489-8a7555abb9be/registry-server/0.log" Oct 09 11:28:50 crc kubenswrapper[4740]: I1009 11:28:50.237779 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p8hx5_eb472a61-940e-48b4-be35-bec21b4eca3c/extract-content/0.log" Oct 09 11:28:50 crc kubenswrapper[4740]: I1009 11:28:50.237808 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-p8hx5_eb472a61-940e-48b4-be35-bec21b4eca3c/extract-content/0.log" Oct 09 11:28:50 crc kubenswrapper[4740]: I1009 11:28:50.247830 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p8hx5_eb472a61-940e-48b4-be35-bec21b4eca3c/extract-utilities/0.log" Oct 09 11:28:50 crc kubenswrapper[4740]: I1009 11:28:50.368495 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p8hx5_eb472a61-940e-48b4-be35-bec21b4eca3c/extract-utilities/0.log" Oct 09 11:28:50 crc kubenswrapper[4740]: I1009 11:28:50.409382 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p8hx5_eb472a61-940e-48b4-be35-bec21b4eca3c/extract-content/0.log" Oct 09 11:28:50 crc kubenswrapper[4740]: I1009 11:28:50.568221 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh_5648ded7-a244-4850-ba02-14aa59ec31f1/util/0.log" Oct 09 11:28:50 crc kubenswrapper[4740]: I1009 11:28:50.748571 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh_5648ded7-a244-4850-ba02-14aa59ec31f1/util/0.log" Oct 09 11:28:50 crc kubenswrapper[4740]: I1009 11:28:50.790270 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh_5648ded7-a244-4850-ba02-14aa59ec31f1/pull/0.log" Oct 09 11:28:50 crc kubenswrapper[4740]: I1009 11:28:50.794966 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh_5648ded7-a244-4850-ba02-14aa59ec31f1/pull/0.log" Oct 09 11:28:50 crc kubenswrapper[4740]: I1009 11:28:50.937857 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-p8hx5_eb472a61-940e-48b4-be35-bec21b4eca3c/registry-server/0.log" Oct 09 11:28:51 crc kubenswrapper[4740]: I1009 11:28:51.026490 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh_5648ded7-a244-4850-ba02-14aa59ec31f1/extract/0.log" Oct 09 11:28:51 crc kubenswrapper[4740]: I1009 11:28:51.053218 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh_5648ded7-a244-4850-ba02-14aa59ec31f1/pull/0.log" Oct 09 11:28:51 crc kubenswrapper[4740]: I1009 11:28:51.086740 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh_5648ded7-a244-4850-ba02-14aa59ec31f1/util/0.log" Oct 09 11:28:51 crc kubenswrapper[4740]: I1009 11:28:51.225916 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zlsxx_26400837-285d-412b-944c-5b1fcb42b34f/marketplace-operator/0.log" Oct 09 11:28:51 crc kubenswrapper[4740]: I1009 11:28:51.283790 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cnjss_64ce5306-aa76-4f28-bb81-4a37bbb283e8/extract-utilities/0.log" Oct 09 11:28:51 crc kubenswrapper[4740]: I1009 11:28:51.464918 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cnjss_64ce5306-aa76-4f28-bb81-4a37bbb283e8/extract-content/0.log" Oct 09 11:28:51 crc kubenswrapper[4740]: I1009 11:28:51.512412 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cnjss_64ce5306-aa76-4f28-bb81-4a37bbb283e8/extract-content/0.log" Oct 09 11:28:51 crc kubenswrapper[4740]: I1009 11:28:51.519646 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-cnjss_64ce5306-aa76-4f28-bb81-4a37bbb283e8/extract-utilities/0.log" Oct 09 11:28:51 crc kubenswrapper[4740]: I1009 11:28:51.705618 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cnjss_64ce5306-aa76-4f28-bb81-4a37bbb283e8/extract-content/0.log" Oct 09 11:28:51 crc kubenswrapper[4740]: I1009 11:28:51.712988 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cnjss_64ce5306-aa76-4f28-bb81-4a37bbb283e8/extract-utilities/0.log" Oct 09 11:28:51 crc kubenswrapper[4740]: I1009 11:28:51.862981 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cnjss_64ce5306-aa76-4f28-bb81-4a37bbb283e8/registry-server/0.log" Oct 09 11:28:51 crc kubenswrapper[4740]: I1009 11:28:51.924121 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qktxn_310ad751-9417-4fb0-bdf2-d892a167b55f/extract-utilities/0.log" Oct 09 11:28:52 crc kubenswrapper[4740]: I1009 11:28:52.045572 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qktxn_310ad751-9417-4fb0-bdf2-d892a167b55f/extract-utilities/0.log" Oct 09 11:28:52 crc kubenswrapper[4740]: I1009 11:28:52.070199 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qktxn_310ad751-9417-4fb0-bdf2-d892a167b55f/extract-content/0.log" Oct 09 11:28:52 crc kubenswrapper[4740]: I1009 11:28:52.093965 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qktxn_310ad751-9417-4fb0-bdf2-d892a167b55f/extract-content/0.log" Oct 09 11:28:52 crc kubenswrapper[4740]: I1009 11:28:52.266058 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qktxn_310ad751-9417-4fb0-bdf2-d892a167b55f/extract-utilities/0.log" 
Oct 09 11:28:52 crc kubenswrapper[4740]: I1009 11:28:52.289104 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qktxn_310ad751-9417-4fb0-bdf2-d892a167b55f/extract-content/0.log" Oct 09 11:28:52 crc kubenswrapper[4740]: I1009 11:28:52.753362 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qktxn_310ad751-9417-4fb0-bdf2-d892a167b55f/registry-server/0.log" Oct 09 11:28:55 crc kubenswrapper[4740]: I1009 11:28:55.759389 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:28:55 crc kubenswrapper[4740]: E1009 11:28:55.761027 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:29:08 crc kubenswrapper[4740]: I1009 11:29:08.753965 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:29:08 crc kubenswrapper[4740]: E1009 11:29:08.755034 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:29:23 crc kubenswrapper[4740]: I1009 11:29:23.754821 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 
11:29:23 crc kubenswrapper[4740]: E1009 11:29:23.756347 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:29:34 crc kubenswrapper[4740]: I1009 11:29:34.755959 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:29:34 crc kubenswrapper[4740]: E1009 11:29:34.756966 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:29:45 crc kubenswrapper[4740]: I1009 11:29:45.753732 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:29:45 crc kubenswrapper[4740]: E1009 11:29:45.754535 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:29:53 crc kubenswrapper[4740]: I1009 11:29:53.485998 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2pkxt"] 
Oct 09 11:29:53 crc kubenswrapper[4740]: E1009 11:29:53.492165 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5f13896-f36c-4860-9053-8c6a177c21df" containerName="container-00" Oct 09 11:29:53 crc kubenswrapper[4740]: I1009 11:29:53.492205 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5f13896-f36c-4860-9053-8c6a177c21df" containerName="container-00" Oct 09 11:29:53 crc kubenswrapper[4740]: I1009 11:29:53.492478 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5f13896-f36c-4860-9053-8c6a177c21df" containerName="container-00" Oct 09 11:29:53 crc kubenswrapper[4740]: I1009 11:29:53.496913 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2pkxt"] Oct 09 11:29:53 crc kubenswrapper[4740]: I1009 11:29:53.497074 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2pkxt" Oct 09 11:29:53 crc kubenswrapper[4740]: I1009 11:29:53.709689 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9mxj\" (UniqueName: \"kubernetes.io/projected/5a6c75db-559d-4548-9d5c-db41f9342e9d-kube-api-access-r9mxj\") pod \"redhat-operators-2pkxt\" (UID: \"5a6c75db-559d-4548-9d5c-db41f9342e9d\") " pod="openshift-marketplace/redhat-operators-2pkxt" Oct 09 11:29:53 crc kubenswrapper[4740]: I1009 11:29:53.709802 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a6c75db-559d-4548-9d5c-db41f9342e9d-utilities\") pod \"redhat-operators-2pkxt\" (UID: \"5a6c75db-559d-4548-9d5c-db41f9342e9d\") " pod="openshift-marketplace/redhat-operators-2pkxt" Oct 09 11:29:53 crc kubenswrapper[4740]: I1009 11:29:53.710064 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5a6c75db-559d-4548-9d5c-db41f9342e9d-catalog-content\") pod \"redhat-operators-2pkxt\" (UID: \"5a6c75db-559d-4548-9d5c-db41f9342e9d\") " pod="openshift-marketplace/redhat-operators-2pkxt" Oct 09 11:29:53 crc kubenswrapper[4740]: I1009 11:29:53.811465 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a6c75db-559d-4548-9d5c-db41f9342e9d-catalog-content\") pod \"redhat-operators-2pkxt\" (UID: \"5a6c75db-559d-4548-9d5c-db41f9342e9d\") " pod="openshift-marketplace/redhat-operators-2pkxt" Oct 09 11:29:53 crc kubenswrapper[4740]: I1009 11:29:53.811584 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9mxj\" (UniqueName: \"kubernetes.io/projected/5a6c75db-559d-4548-9d5c-db41f9342e9d-kube-api-access-r9mxj\") pod \"redhat-operators-2pkxt\" (UID: \"5a6c75db-559d-4548-9d5c-db41f9342e9d\") " pod="openshift-marketplace/redhat-operators-2pkxt" Oct 09 11:29:53 crc kubenswrapper[4740]: I1009 11:29:53.811638 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a6c75db-559d-4548-9d5c-db41f9342e9d-utilities\") pod \"redhat-operators-2pkxt\" (UID: \"5a6c75db-559d-4548-9d5c-db41f9342e9d\") " pod="openshift-marketplace/redhat-operators-2pkxt" Oct 09 11:29:53 crc kubenswrapper[4740]: I1009 11:29:53.812586 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a6c75db-559d-4548-9d5c-db41f9342e9d-utilities\") pod \"redhat-operators-2pkxt\" (UID: \"5a6c75db-559d-4548-9d5c-db41f9342e9d\") " pod="openshift-marketplace/redhat-operators-2pkxt" Oct 09 11:29:53 crc kubenswrapper[4740]: I1009 11:29:53.813390 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5a6c75db-559d-4548-9d5c-db41f9342e9d-catalog-content\") pod \"redhat-operators-2pkxt\" (UID: \"5a6c75db-559d-4548-9d5c-db41f9342e9d\") " pod="openshift-marketplace/redhat-operators-2pkxt" Oct 09 11:29:53 crc kubenswrapper[4740]: I1009 11:29:53.835389 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9mxj\" (UniqueName: \"kubernetes.io/projected/5a6c75db-559d-4548-9d5c-db41f9342e9d-kube-api-access-r9mxj\") pod \"redhat-operators-2pkxt\" (UID: \"5a6c75db-559d-4548-9d5c-db41f9342e9d\") " pod="openshift-marketplace/redhat-operators-2pkxt" Oct 09 11:29:54 crc kubenswrapper[4740]: I1009 11:29:54.127004 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2pkxt" Oct 09 11:29:54 crc kubenswrapper[4740]: I1009 11:29:54.622002 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2pkxt"] Oct 09 11:29:54 crc kubenswrapper[4740]: I1009 11:29:54.824829 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pkxt" event={"ID":"5a6c75db-559d-4548-9d5c-db41f9342e9d","Type":"ContainerStarted","Data":"7ed6805ec909da924d7d2f6b064a42505f476e7408d1135f008db3215b054aac"} Oct 09 11:29:55 crc kubenswrapper[4740]: I1009 11:29:55.851088 4740 generic.go:334] "Generic (PLEG): container finished" podID="5a6c75db-559d-4548-9d5c-db41f9342e9d" containerID="e16d4350b983235f091e5ae307ea5bb51e618ab9d8c91c9daf183f9a1ff07dad" exitCode=0 Oct 09 11:29:55 crc kubenswrapper[4740]: I1009 11:29:55.851160 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pkxt" event={"ID":"5a6c75db-559d-4548-9d5c-db41f9342e9d","Type":"ContainerDied","Data":"e16d4350b983235f091e5ae307ea5bb51e618ab9d8c91c9daf183f9a1ff07dad"} Oct 09 11:29:55 crc kubenswrapper[4740]: I1009 11:29:55.858274 4740 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Oct 09 11:29:57 crc kubenswrapper[4740]: I1009 11:29:57.872644 4740 generic.go:334] "Generic (PLEG): container finished" podID="5a6c75db-559d-4548-9d5c-db41f9342e9d" containerID="cf1a3d156ef6e60bd8b50212f585e7344170a01cf36b44b77d2ab48a8b5a3f54" exitCode=0 Oct 09 11:29:57 crc kubenswrapper[4740]: I1009 11:29:57.872801 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pkxt" event={"ID":"5a6c75db-559d-4548-9d5c-db41f9342e9d","Type":"ContainerDied","Data":"cf1a3d156ef6e60bd8b50212f585e7344170a01cf36b44b77d2ab48a8b5a3f54"} Oct 09 11:29:58 crc kubenswrapper[4740]: I1009 11:29:58.932655 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2pkxt" podStartSLOduration=3.150959093 podStartE2EDuration="5.932625572s" podCreationTimestamp="2025-10-09 11:29:53 +0000 UTC" firstStartedPulling="2025-10-09 11:29:55.858015031 +0000 UTC m=+3734.820215412" lastFinishedPulling="2025-10-09 11:29:58.63968151 +0000 UTC m=+3737.601881891" observedRunningTime="2025-10-09 11:29:58.914231428 +0000 UTC m=+3737.876431849" watchObservedRunningTime="2025-10-09 11:29:58.932625572 +0000 UTC m=+3737.894825993" Oct 09 11:29:59 crc kubenswrapper[4740]: I1009 11:29:59.758573 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:29:59 crc kubenswrapper[4740]: E1009 11:29:59.760313 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:29:59 crc kubenswrapper[4740]: I1009 11:29:59.906160 4740 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pkxt" event={"ID":"5a6c75db-559d-4548-9d5c-db41f9342e9d","Type":"ContainerStarted","Data":"a5bdbadc97b56088da606957ea289926a96438ab6a448563a1f35e0691695dbf"} Oct 09 11:30:00 crc kubenswrapper[4740]: I1009 11:30:00.185376 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333490-95vnl"] Oct 09 11:30:00 crc kubenswrapper[4740]: I1009 11:30:00.187316 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333490-95vnl" Oct 09 11:30:00 crc kubenswrapper[4740]: I1009 11:30:00.190893 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 11:30:00 crc kubenswrapper[4740]: I1009 11:30:00.194487 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 11:30:00 crc kubenswrapper[4740]: I1009 11:30:00.197248 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333490-95vnl"] Oct 09 11:30:00 crc kubenswrapper[4740]: I1009 11:30:00.236529 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d63a132-0bdd-4fe3-bab0-af9accc67261-secret-volume\") pod \"collect-profiles-29333490-95vnl\" (UID: \"5d63a132-0bdd-4fe3-bab0-af9accc67261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333490-95vnl" Oct 09 11:30:00 crc kubenswrapper[4740]: I1009 11:30:00.236586 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7p72\" (UniqueName: \"kubernetes.io/projected/5d63a132-0bdd-4fe3-bab0-af9accc67261-kube-api-access-j7p72\") pod 
\"collect-profiles-29333490-95vnl\" (UID: \"5d63a132-0bdd-4fe3-bab0-af9accc67261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333490-95vnl" Oct 09 11:30:00 crc kubenswrapper[4740]: I1009 11:30:00.236694 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d63a132-0bdd-4fe3-bab0-af9accc67261-config-volume\") pod \"collect-profiles-29333490-95vnl\" (UID: \"5d63a132-0bdd-4fe3-bab0-af9accc67261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333490-95vnl" Oct 09 11:30:00 crc kubenswrapper[4740]: I1009 11:30:00.337650 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d63a132-0bdd-4fe3-bab0-af9accc67261-config-volume\") pod \"collect-profiles-29333490-95vnl\" (UID: \"5d63a132-0bdd-4fe3-bab0-af9accc67261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333490-95vnl" Oct 09 11:30:00 crc kubenswrapper[4740]: I1009 11:30:00.337718 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d63a132-0bdd-4fe3-bab0-af9accc67261-secret-volume\") pod \"collect-profiles-29333490-95vnl\" (UID: \"5d63a132-0bdd-4fe3-bab0-af9accc67261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333490-95vnl" Oct 09 11:30:00 crc kubenswrapper[4740]: I1009 11:30:00.337764 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7p72\" (UniqueName: \"kubernetes.io/projected/5d63a132-0bdd-4fe3-bab0-af9accc67261-kube-api-access-j7p72\") pod \"collect-profiles-29333490-95vnl\" (UID: \"5d63a132-0bdd-4fe3-bab0-af9accc67261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333490-95vnl" Oct 09 11:30:00 crc kubenswrapper[4740]: I1009 11:30:00.338650 4740 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d63a132-0bdd-4fe3-bab0-af9accc67261-config-volume\") pod \"collect-profiles-29333490-95vnl\" (UID: \"5d63a132-0bdd-4fe3-bab0-af9accc67261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333490-95vnl" Oct 09 11:30:00 crc kubenswrapper[4740]: I1009 11:30:00.352883 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d63a132-0bdd-4fe3-bab0-af9accc67261-secret-volume\") pod \"collect-profiles-29333490-95vnl\" (UID: \"5d63a132-0bdd-4fe3-bab0-af9accc67261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333490-95vnl" Oct 09 11:30:00 crc kubenswrapper[4740]: I1009 11:30:00.362919 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7p72\" (UniqueName: \"kubernetes.io/projected/5d63a132-0bdd-4fe3-bab0-af9accc67261-kube-api-access-j7p72\") pod \"collect-profiles-29333490-95vnl\" (UID: \"5d63a132-0bdd-4fe3-bab0-af9accc67261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333490-95vnl" Oct 09 11:30:00 crc kubenswrapper[4740]: I1009 11:30:00.506524 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333490-95vnl" Oct 09 11:30:01 crc kubenswrapper[4740]: I1009 11:30:00.978696 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333490-95vnl"] Oct 09 11:30:01 crc kubenswrapper[4740]: I1009 11:30:01.931811 4740 generic.go:334] "Generic (PLEG): container finished" podID="5d63a132-0bdd-4fe3-bab0-af9accc67261" containerID="341d5936e0100daa1f95e16ce5515e11d32638150c8084d5e72b4f4fc74e093c" exitCode=0 Oct 09 11:30:01 crc kubenswrapper[4740]: I1009 11:30:01.931879 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333490-95vnl" event={"ID":"5d63a132-0bdd-4fe3-bab0-af9accc67261","Type":"ContainerDied","Data":"341d5936e0100daa1f95e16ce5515e11d32638150c8084d5e72b4f4fc74e093c"} Oct 09 11:30:01 crc kubenswrapper[4740]: I1009 11:30:01.932306 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333490-95vnl" event={"ID":"5d63a132-0bdd-4fe3-bab0-af9accc67261","Type":"ContainerStarted","Data":"a9917d83c31a478de3fc55dd8d3cdd35965eaa0d4ecc0b3eaa0991a50c5642fa"} Oct 09 11:30:03 crc kubenswrapper[4740]: I1009 11:30:03.303178 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333490-95vnl" Oct 09 11:30:03 crc kubenswrapper[4740]: I1009 11:30:03.402922 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7p72\" (UniqueName: \"kubernetes.io/projected/5d63a132-0bdd-4fe3-bab0-af9accc67261-kube-api-access-j7p72\") pod \"5d63a132-0bdd-4fe3-bab0-af9accc67261\" (UID: \"5d63a132-0bdd-4fe3-bab0-af9accc67261\") " Oct 09 11:30:03 crc kubenswrapper[4740]: I1009 11:30:03.403356 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d63a132-0bdd-4fe3-bab0-af9accc67261-secret-volume\") pod \"5d63a132-0bdd-4fe3-bab0-af9accc67261\" (UID: \"5d63a132-0bdd-4fe3-bab0-af9accc67261\") " Oct 09 11:30:03 crc kubenswrapper[4740]: I1009 11:30:03.403391 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d63a132-0bdd-4fe3-bab0-af9accc67261-config-volume\") pod \"5d63a132-0bdd-4fe3-bab0-af9accc67261\" (UID: \"5d63a132-0bdd-4fe3-bab0-af9accc67261\") " Oct 09 11:30:03 crc kubenswrapper[4740]: I1009 11:30:03.404067 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d63a132-0bdd-4fe3-bab0-af9accc67261-config-volume" (OuterVolumeSpecName: "config-volume") pod "5d63a132-0bdd-4fe3-bab0-af9accc67261" (UID: "5d63a132-0bdd-4fe3-bab0-af9accc67261"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 11:30:03 crc kubenswrapper[4740]: I1009 11:30:03.408697 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d63a132-0bdd-4fe3-bab0-af9accc67261-kube-api-access-j7p72" (OuterVolumeSpecName: "kube-api-access-j7p72") pod "5d63a132-0bdd-4fe3-bab0-af9accc67261" (UID: "5d63a132-0bdd-4fe3-bab0-af9accc67261"). 
InnerVolumeSpecName "kube-api-access-j7p72". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:30:03 crc kubenswrapper[4740]: I1009 11:30:03.410370 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d63a132-0bdd-4fe3-bab0-af9accc67261-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5d63a132-0bdd-4fe3-bab0-af9accc67261" (UID: "5d63a132-0bdd-4fe3-bab0-af9accc67261"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 11:30:03 crc kubenswrapper[4740]: I1009 11:30:03.505602 4740 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d63a132-0bdd-4fe3-bab0-af9accc67261-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 11:30:03 crc kubenswrapper[4740]: I1009 11:30:03.505641 4740 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d63a132-0bdd-4fe3-bab0-af9accc67261-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 11:30:03 crc kubenswrapper[4740]: I1009 11:30:03.505654 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7p72\" (UniqueName: \"kubernetes.io/projected/5d63a132-0bdd-4fe3-bab0-af9accc67261-kube-api-access-j7p72\") on node \"crc\" DevicePath \"\"" Oct 09 11:30:03 crc kubenswrapper[4740]: I1009 11:30:03.951468 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333490-95vnl" event={"ID":"5d63a132-0bdd-4fe3-bab0-af9accc67261","Type":"ContainerDied","Data":"a9917d83c31a478de3fc55dd8d3cdd35965eaa0d4ecc0b3eaa0991a50c5642fa"} Oct 09 11:30:03 crc kubenswrapper[4740]: I1009 11:30:03.951501 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9917d83c31a478de3fc55dd8d3cdd35965eaa0d4ecc0b3eaa0991a50c5642fa" Oct 09 11:30:03 crc kubenswrapper[4740]: I1009 11:30:03.951545 4740 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333490-95vnl" Oct 09 11:30:04 crc kubenswrapper[4740]: I1009 11:30:04.127645 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2pkxt" Oct 09 11:30:04 crc kubenswrapper[4740]: I1009 11:30:04.127704 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2pkxt" Oct 09 11:30:04 crc kubenswrapper[4740]: I1009 11:30:04.205254 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2pkxt" Oct 09 11:30:04 crc kubenswrapper[4740]: I1009 11:30:04.388706 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333445-lhw2q"] Oct 09 11:30:04 crc kubenswrapper[4740]: I1009 11:30:04.402037 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333445-lhw2q"] Oct 09 11:30:05 crc kubenswrapper[4740]: I1009 11:30:05.374444 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2pkxt" Oct 09 11:30:05 crc kubenswrapper[4740]: I1009 11:30:05.422811 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2pkxt"] Oct 09 11:30:05 crc kubenswrapper[4740]: I1009 11:30:05.775551 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2692046-9849-4c7f-a506-5767b57dcc85" path="/var/lib/kubelet/pods/c2692046-9849-4c7f-a506-5767b57dcc85/volumes" Oct 09 11:30:06 crc kubenswrapper[4740]: I1009 11:30:06.992177 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2pkxt" podUID="5a6c75db-559d-4548-9d5c-db41f9342e9d" containerName="registry-server" 
containerID="cri-o://a5bdbadc97b56088da606957ea289926a96438ab6a448563a1f35e0691695dbf" gracePeriod=2 Oct 09 11:30:07 crc kubenswrapper[4740]: I1009 11:30:07.979337 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2pkxt" Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.003195 4740 generic.go:334] "Generic (PLEG): container finished" podID="5a6c75db-559d-4548-9d5c-db41f9342e9d" containerID="a5bdbadc97b56088da606957ea289926a96438ab6a448563a1f35e0691695dbf" exitCode=0 Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.003237 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pkxt" event={"ID":"5a6c75db-559d-4548-9d5c-db41f9342e9d","Type":"ContainerDied","Data":"a5bdbadc97b56088da606957ea289926a96438ab6a448563a1f35e0691695dbf"} Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.003264 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pkxt" event={"ID":"5a6c75db-559d-4548-9d5c-db41f9342e9d","Type":"ContainerDied","Data":"7ed6805ec909da924d7d2f6b064a42505f476e7408d1135f008db3215b054aac"} Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.003284 4740 scope.go:117] "RemoveContainer" containerID="a5bdbadc97b56088da606957ea289926a96438ab6a448563a1f35e0691695dbf" Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.003415 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2pkxt" Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.026629 4740 scope.go:117] "RemoveContainer" containerID="cf1a3d156ef6e60bd8b50212f585e7344170a01cf36b44b77d2ab48a8b5a3f54" Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.057515 4740 scope.go:117] "RemoveContainer" containerID="e16d4350b983235f091e5ae307ea5bb51e618ab9d8c91c9daf183f9a1ff07dad" Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.097651 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a6c75db-559d-4548-9d5c-db41f9342e9d-catalog-content\") pod \"5a6c75db-559d-4548-9d5c-db41f9342e9d\" (UID: \"5a6c75db-559d-4548-9d5c-db41f9342e9d\") " Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.097729 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9mxj\" (UniqueName: \"kubernetes.io/projected/5a6c75db-559d-4548-9d5c-db41f9342e9d-kube-api-access-r9mxj\") pod \"5a6c75db-559d-4548-9d5c-db41f9342e9d\" (UID: \"5a6c75db-559d-4548-9d5c-db41f9342e9d\") " Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.097876 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a6c75db-559d-4548-9d5c-db41f9342e9d-utilities\") pod \"5a6c75db-559d-4548-9d5c-db41f9342e9d\" (UID: \"5a6c75db-559d-4548-9d5c-db41f9342e9d\") " Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.099063 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a6c75db-559d-4548-9d5c-db41f9342e9d-utilities" (OuterVolumeSpecName: "utilities") pod "5a6c75db-559d-4548-9d5c-db41f9342e9d" (UID: "5a6c75db-559d-4548-9d5c-db41f9342e9d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.105374 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a6c75db-559d-4548-9d5c-db41f9342e9d-kube-api-access-r9mxj" (OuterVolumeSpecName: "kube-api-access-r9mxj") pod "5a6c75db-559d-4548-9d5c-db41f9342e9d" (UID: "5a6c75db-559d-4548-9d5c-db41f9342e9d"). InnerVolumeSpecName "kube-api-access-r9mxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.113326 4740 scope.go:117] "RemoveContainer" containerID="a5bdbadc97b56088da606957ea289926a96438ab6a448563a1f35e0691695dbf" Oct 09 11:30:08 crc kubenswrapper[4740]: E1009 11:30:08.113934 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5bdbadc97b56088da606957ea289926a96438ab6a448563a1f35e0691695dbf\": container with ID starting with a5bdbadc97b56088da606957ea289926a96438ab6a448563a1f35e0691695dbf not found: ID does not exist" containerID="a5bdbadc97b56088da606957ea289926a96438ab6a448563a1f35e0691695dbf" Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.114027 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5bdbadc97b56088da606957ea289926a96438ab6a448563a1f35e0691695dbf"} err="failed to get container status \"a5bdbadc97b56088da606957ea289926a96438ab6a448563a1f35e0691695dbf\": rpc error: code = NotFound desc = could not find container \"a5bdbadc97b56088da606957ea289926a96438ab6a448563a1f35e0691695dbf\": container with ID starting with a5bdbadc97b56088da606957ea289926a96438ab6a448563a1f35e0691695dbf not found: ID does not exist" Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.114099 4740 scope.go:117] "RemoveContainer" containerID="cf1a3d156ef6e60bd8b50212f585e7344170a01cf36b44b77d2ab48a8b5a3f54" Oct 09 11:30:08 crc kubenswrapper[4740]: E1009 11:30:08.114491 
4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf1a3d156ef6e60bd8b50212f585e7344170a01cf36b44b77d2ab48a8b5a3f54\": container with ID starting with cf1a3d156ef6e60bd8b50212f585e7344170a01cf36b44b77d2ab48a8b5a3f54 not found: ID does not exist" containerID="cf1a3d156ef6e60bd8b50212f585e7344170a01cf36b44b77d2ab48a8b5a3f54" Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.114563 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf1a3d156ef6e60bd8b50212f585e7344170a01cf36b44b77d2ab48a8b5a3f54"} err="failed to get container status \"cf1a3d156ef6e60bd8b50212f585e7344170a01cf36b44b77d2ab48a8b5a3f54\": rpc error: code = NotFound desc = could not find container \"cf1a3d156ef6e60bd8b50212f585e7344170a01cf36b44b77d2ab48a8b5a3f54\": container with ID starting with cf1a3d156ef6e60bd8b50212f585e7344170a01cf36b44b77d2ab48a8b5a3f54 not found: ID does not exist" Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.114635 4740 scope.go:117] "RemoveContainer" containerID="e16d4350b983235f091e5ae307ea5bb51e618ab9d8c91c9daf183f9a1ff07dad" Oct 09 11:30:08 crc kubenswrapper[4740]: E1009 11:30:08.115241 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e16d4350b983235f091e5ae307ea5bb51e618ab9d8c91c9daf183f9a1ff07dad\": container with ID starting with e16d4350b983235f091e5ae307ea5bb51e618ab9d8c91c9daf183f9a1ff07dad not found: ID does not exist" containerID="e16d4350b983235f091e5ae307ea5bb51e618ab9d8c91c9daf183f9a1ff07dad" Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.115285 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e16d4350b983235f091e5ae307ea5bb51e618ab9d8c91c9daf183f9a1ff07dad"} err="failed to get container status \"e16d4350b983235f091e5ae307ea5bb51e618ab9d8c91c9daf183f9a1ff07dad\": rpc error: code = 
NotFound desc = could not find container \"e16d4350b983235f091e5ae307ea5bb51e618ab9d8c91c9daf183f9a1ff07dad\": container with ID starting with e16d4350b983235f091e5ae307ea5bb51e618ab9d8c91c9daf183f9a1ff07dad not found: ID does not exist" Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.188748 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a6c75db-559d-4548-9d5c-db41f9342e9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a6c75db-559d-4548-9d5c-db41f9342e9d" (UID: "5a6c75db-559d-4548-9d5c-db41f9342e9d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.201200 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a6c75db-559d-4548-9d5c-db41f9342e9d-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.201251 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a6c75db-559d-4548-9d5c-db41f9342e9d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.201266 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9mxj\" (UniqueName: \"kubernetes.io/projected/5a6c75db-559d-4548-9d5c-db41f9342e9d-kube-api-access-r9mxj\") on node \"crc\" DevicePath \"\"" Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.345774 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2pkxt"] Oct 09 11:30:08 crc kubenswrapper[4740]: I1009 11:30:08.355866 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2pkxt"] Oct 09 11:30:09 crc kubenswrapper[4740]: I1009 11:30:09.766290 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5a6c75db-559d-4548-9d5c-db41f9342e9d" path="/var/lib/kubelet/pods/5a6c75db-559d-4548-9d5c-db41f9342e9d/volumes" Oct 09 11:30:12 crc kubenswrapper[4740]: I1009 11:30:12.754371 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:30:12 crc kubenswrapper[4740]: E1009 11:30:12.755091 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:30:13 crc kubenswrapper[4740]: I1009 11:30:13.511804 4740 scope.go:117] "RemoveContainer" containerID="5b4fce3a7571db0590af42395ac520a1ac480a36e1744b04d075202d1a3f6fe3" Oct 09 11:30:23 crc kubenswrapper[4740]: I1009 11:30:23.754110 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:30:23 crc kubenswrapper[4740]: E1009 11:30:23.757052 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:30:29 crc kubenswrapper[4740]: I1009 11:30:29.227236 4740 generic.go:334] "Generic (PLEG): container finished" podID="9669ea77-a286-4fe0-8a3f-26653ca161e5" containerID="56c4518f771884721960d702faf5b34d5899cb6efcd70ffa8f72aef91632b1e1" exitCode=0 Oct 09 11:30:29 crc kubenswrapper[4740]: I1009 11:30:29.227339 4740 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-mq5nd/must-gather-kszhj" event={"ID":"9669ea77-a286-4fe0-8a3f-26653ca161e5","Type":"ContainerDied","Data":"56c4518f771884721960d702faf5b34d5899cb6efcd70ffa8f72aef91632b1e1"} Oct 09 11:30:29 crc kubenswrapper[4740]: I1009 11:30:29.228329 4740 scope.go:117] "RemoveContainer" containerID="56c4518f771884721960d702faf5b34d5899cb6efcd70ffa8f72aef91632b1e1" Oct 09 11:30:30 crc kubenswrapper[4740]: I1009 11:30:30.156603 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mq5nd_must-gather-kszhj_9669ea77-a286-4fe0-8a3f-26653ca161e5/gather/0.log" Oct 09 11:30:35 crc kubenswrapper[4740]: I1009 11:30:35.754164 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:30:35 crc kubenswrapper[4740]: E1009 11:30:35.755024 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:30:37 crc kubenswrapper[4740]: I1009 11:30:37.973828 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mq5nd/must-gather-kszhj"] Oct 09 11:30:37 crc kubenswrapper[4740]: I1009 11:30:37.974885 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-mq5nd/must-gather-kszhj" podUID="9669ea77-a286-4fe0-8a3f-26653ca161e5" containerName="copy" containerID="cri-o://228aae2a2727c17fc5a8fcee31820460e04dc56143f6e736396da3e1663b9a4a" gracePeriod=2 Oct 09 11:30:37 crc kubenswrapper[4740]: I1009 11:30:37.984644 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mq5nd/must-gather-kszhj"] 
Oct 09 11:30:38 crc kubenswrapper[4740]: I1009 11:30:38.317240 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mq5nd_must-gather-kszhj_9669ea77-a286-4fe0-8a3f-26653ca161e5/copy/0.log" Oct 09 11:30:38 crc kubenswrapper[4740]: I1009 11:30:38.318371 4740 generic.go:334] "Generic (PLEG): container finished" podID="9669ea77-a286-4fe0-8a3f-26653ca161e5" containerID="228aae2a2727c17fc5a8fcee31820460e04dc56143f6e736396da3e1663b9a4a" exitCode=143 Oct 09 11:30:38 crc kubenswrapper[4740]: I1009 11:30:38.407192 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mq5nd_must-gather-kszhj_9669ea77-a286-4fe0-8a3f-26653ca161e5/copy/0.log" Oct 09 11:30:38 crc kubenswrapper[4740]: I1009 11:30:38.408084 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mq5nd/must-gather-kszhj" Oct 09 11:30:38 crc kubenswrapper[4740]: I1009 11:30:38.487062 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9669ea77-a286-4fe0-8a3f-26653ca161e5-must-gather-output\") pod \"9669ea77-a286-4fe0-8a3f-26653ca161e5\" (UID: \"9669ea77-a286-4fe0-8a3f-26653ca161e5\") " Oct 09 11:30:38 crc kubenswrapper[4740]: I1009 11:30:38.487165 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvtlq\" (UniqueName: \"kubernetes.io/projected/9669ea77-a286-4fe0-8a3f-26653ca161e5-kube-api-access-cvtlq\") pod \"9669ea77-a286-4fe0-8a3f-26653ca161e5\" (UID: \"9669ea77-a286-4fe0-8a3f-26653ca161e5\") " Oct 09 11:30:38 crc kubenswrapper[4740]: I1009 11:30:38.498173 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9669ea77-a286-4fe0-8a3f-26653ca161e5-kube-api-access-cvtlq" (OuterVolumeSpecName: "kube-api-access-cvtlq") pod "9669ea77-a286-4fe0-8a3f-26653ca161e5" (UID: "9669ea77-a286-4fe0-8a3f-26653ca161e5"). 
InnerVolumeSpecName "kube-api-access-cvtlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:30:38 crc kubenswrapper[4740]: I1009 11:30:38.589628 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvtlq\" (UniqueName: \"kubernetes.io/projected/9669ea77-a286-4fe0-8a3f-26653ca161e5-kube-api-access-cvtlq\") on node \"crc\" DevicePath \"\"" Oct 09 11:30:38 crc kubenswrapper[4740]: I1009 11:30:38.622115 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9669ea77-a286-4fe0-8a3f-26653ca161e5-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9669ea77-a286-4fe0-8a3f-26653ca161e5" (UID: "9669ea77-a286-4fe0-8a3f-26653ca161e5"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:30:38 crc kubenswrapper[4740]: I1009 11:30:38.692763 4740 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9669ea77-a286-4fe0-8a3f-26653ca161e5-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 09 11:30:39 crc kubenswrapper[4740]: I1009 11:30:39.328032 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mq5nd_must-gather-kszhj_9669ea77-a286-4fe0-8a3f-26653ca161e5/copy/0.log" Oct 09 11:30:39 crc kubenswrapper[4740]: I1009 11:30:39.328412 4740 scope.go:117] "RemoveContainer" containerID="228aae2a2727c17fc5a8fcee31820460e04dc56143f6e736396da3e1663b9a4a" Oct 09 11:30:39 crc kubenswrapper[4740]: I1009 11:30:39.328451 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mq5nd/must-gather-kszhj" Oct 09 11:30:39 crc kubenswrapper[4740]: I1009 11:30:39.360095 4740 scope.go:117] "RemoveContainer" containerID="56c4518f771884721960d702faf5b34d5899cb6efcd70ffa8f72aef91632b1e1" Oct 09 11:30:39 crc kubenswrapper[4740]: I1009 11:30:39.765428 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9669ea77-a286-4fe0-8a3f-26653ca161e5" path="/var/lib/kubelet/pods/9669ea77-a286-4fe0-8a3f-26653ca161e5/volumes" Oct 09 11:30:46 crc kubenswrapper[4740]: I1009 11:30:46.753712 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:30:46 crc kubenswrapper[4740]: E1009 11:30:46.755574 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:30:57 crc kubenswrapper[4740]: I1009 11:30:57.754702 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:30:57 crc kubenswrapper[4740]: E1009 11:30:57.755360 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:31:09 crc kubenswrapper[4740]: I1009 11:31:09.754948 4740 scope.go:117] "RemoveContainer" 
containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:31:09 crc kubenswrapper[4740]: E1009 11:31:09.755828 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.319479 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6swms/must-gather-bqb9l"] Oct 09 11:31:14 crc kubenswrapper[4740]: E1009 11:31:14.320388 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d63a132-0bdd-4fe3-bab0-af9accc67261" containerName="collect-profiles" Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.320401 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d63a132-0bdd-4fe3-bab0-af9accc67261" containerName="collect-profiles" Oct 09 11:31:14 crc kubenswrapper[4740]: E1009 11:31:14.320427 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9669ea77-a286-4fe0-8a3f-26653ca161e5" containerName="gather" Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.320433 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="9669ea77-a286-4fe0-8a3f-26653ca161e5" containerName="gather" Oct 09 11:31:14 crc kubenswrapper[4740]: E1009 11:31:14.320445 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9669ea77-a286-4fe0-8a3f-26653ca161e5" containerName="copy" Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.320451 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="9669ea77-a286-4fe0-8a3f-26653ca161e5" containerName="copy" Oct 09 11:31:14 crc kubenswrapper[4740]: E1009 11:31:14.320465 4740 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5a6c75db-559d-4548-9d5c-db41f9342e9d" containerName="registry-server" Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.320470 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a6c75db-559d-4548-9d5c-db41f9342e9d" containerName="registry-server" Oct 09 11:31:14 crc kubenswrapper[4740]: E1009 11:31:14.320481 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a6c75db-559d-4548-9d5c-db41f9342e9d" containerName="extract-content" Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.320486 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a6c75db-559d-4548-9d5c-db41f9342e9d" containerName="extract-content" Oct 09 11:31:14 crc kubenswrapper[4740]: E1009 11:31:14.320507 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a6c75db-559d-4548-9d5c-db41f9342e9d" containerName="extract-utilities" Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.320512 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a6c75db-559d-4548-9d5c-db41f9342e9d" containerName="extract-utilities" Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.320688 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d63a132-0bdd-4fe3-bab0-af9accc67261" containerName="collect-profiles" Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.320697 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="9669ea77-a286-4fe0-8a3f-26653ca161e5" containerName="copy" Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.320712 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="9669ea77-a286-4fe0-8a3f-26653ca161e5" containerName="gather" Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.320724 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a6c75db-559d-4548-9d5c-db41f9342e9d" containerName="registry-server" Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.321732 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6swms/must-gather-bqb9l" Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.323607 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6swms"/"kube-root-ca.crt" Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.323605 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6swms"/"openshift-service-ca.crt" Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.331014 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6swms/must-gather-bqb9l"] Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.345533 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5ntr\" (UniqueName: \"kubernetes.io/projected/ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2-kube-api-access-m5ntr\") pod \"must-gather-bqb9l\" (UID: \"ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2\") " pod="openshift-must-gather-6swms/must-gather-bqb9l" Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.345680 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2-must-gather-output\") pod \"must-gather-bqb9l\" (UID: \"ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2\") " pod="openshift-must-gather-6swms/must-gather-bqb9l" Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.447229 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5ntr\" (UniqueName: \"kubernetes.io/projected/ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2-kube-api-access-m5ntr\") pod \"must-gather-bqb9l\" (UID: \"ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2\") " pod="openshift-must-gather-6swms/must-gather-bqb9l" Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.447527 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2-must-gather-output\") pod \"must-gather-bqb9l\" (UID: \"ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2\") " pod="openshift-must-gather-6swms/must-gather-bqb9l" Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.448011 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2-must-gather-output\") pod \"must-gather-bqb9l\" (UID: \"ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2\") " pod="openshift-must-gather-6swms/must-gather-bqb9l" Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.472378 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5ntr\" (UniqueName: \"kubernetes.io/projected/ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2-kube-api-access-m5ntr\") pod \"must-gather-bqb9l\" (UID: \"ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2\") " pod="openshift-must-gather-6swms/must-gather-bqb9l" Oct 09 11:31:14 crc kubenswrapper[4740]: I1009 11:31:14.647873 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6swms/must-gather-bqb9l" Oct 09 11:31:15 crc kubenswrapper[4740]: I1009 11:31:15.104235 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6swms/must-gather-bqb9l"] Oct 09 11:31:15 crc kubenswrapper[4740]: I1009 11:31:15.695066 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6swms/must-gather-bqb9l" event={"ID":"ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2","Type":"ContainerStarted","Data":"5ba1edcfde37d08b6e1ed8d9d62104aac7483caf448ee72308fb0c9b8c61b2a4"} Oct 09 11:31:15 crc kubenswrapper[4740]: I1009 11:31:15.695703 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6swms/must-gather-bqb9l" event={"ID":"ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2","Type":"ContainerStarted","Data":"0568a33f5e7da95fecc806738bf271a04b228f33cea9525883cae55298d188b3"} Oct 09 11:31:15 crc kubenswrapper[4740]: I1009 11:31:15.695719 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6swms/must-gather-bqb9l" event={"ID":"ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2","Type":"ContainerStarted","Data":"08fdadec26e161189a9b9dedb37c1fc8e60a5b90395652dc66e0295e09702cf7"} Oct 09 11:31:15 crc kubenswrapper[4740]: I1009 11:31:15.717702 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6swms/must-gather-bqb9l" podStartSLOduration=1.717682009 podStartE2EDuration="1.717682009s" podCreationTimestamp="2025-10-09 11:31:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 11:31:15.717419022 +0000 UTC m=+3814.679619403" watchObservedRunningTime="2025-10-09 11:31:15.717682009 +0000 UTC m=+3814.679882390" Oct 09 11:31:18 crc kubenswrapper[4740]: I1009 11:31:18.979242 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6swms/crc-debug-8fgzd"] Oct 09 11:31:18 crc kubenswrapper[4740]: 
I1009 11:31:18.980631 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6swms/crc-debug-8fgzd" Oct 09 11:31:18 crc kubenswrapper[4740]: I1009 11:31:18.982799 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-6swms"/"default-dockercfg-p6sgg" Oct 09 11:31:19 crc kubenswrapper[4740]: I1009 11:31:19.154367 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g85x8\" (UniqueName: \"kubernetes.io/projected/f8e2c865-5275-4ece-9348-0120bcaff4d0-kube-api-access-g85x8\") pod \"crc-debug-8fgzd\" (UID: \"f8e2c865-5275-4ece-9348-0120bcaff4d0\") " pod="openshift-must-gather-6swms/crc-debug-8fgzd" Oct 09 11:31:19 crc kubenswrapper[4740]: I1009 11:31:19.154833 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8e2c865-5275-4ece-9348-0120bcaff4d0-host\") pod \"crc-debug-8fgzd\" (UID: \"f8e2c865-5275-4ece-9348-0120bcaff4d0\") " pod="openshift-must-gather-6swms/crc-debug-8fgzd" Oct 09 11:31:19 crc kubenswrapper[4740]: I1009 11:31:19.257488 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g85x8\" (UniqueName: \"kubernetes.io/projected/f8e2c865-5275-4ece-9348-0120bcaff4d0-kube-api-access-g85x8\") pod \"crc-debug-8fgzd\" (UID: \"f8e2c865-5275-4ece-9348-0120bcaff4d0\") " pod="openshift-must-gather-6swms/crc-debug-8fgzd" Oct 09 11:31:19 crc kubenswrapper[4740]: I1009 11:31:19.257590 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8e2c865-5275-4ece-9348-0120bcaff4d0-host\") pod \"crc-debug-8fgzd\" (UID: \"f8e2c865-5275-4ece-9348-0120bcaff4d0\") " pod="openshift-must-gather-6swms/crc-debug-8fgzd" Oct 09 11:31:19 crc kubenswrapper[4740]: I1009 11:31:19.257710 4740 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8e2c865-5275-4ece-9348-0120bcaff4d0-host\") pod \"crc-debug-8fgzd\" (UID: \"f8e2c865-5275-4ece-9348-0120bcaff4d0\") " pod="openshift-must-gather-6swms/crc-debug-8fgzd" Oct 09 11:31:19 crc kubenswrapper[4740]: I1009 11:31:19.280842 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g85x8\" (UniqueName: \"kubernetes.io/projected/f8e2c865-5275-4ece-9348-0120bcaff4d0-kube-api-access-g85x8\") pod \"crc-debug-8fgzd\" (UID: \"f8e2c865-5275-4ece-9348-0120bcaff4d0\") " pod="openshift-must-gather-6swms/crc-debug-8fgzd" Oct 09 11:31:19 crc kubenswrapper[4740]: I1009 11:31:19.298456 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6swms/crc-debug-8fgzd" Oct 09 11:31:19 crc kubenswrapper[4740]: W1009 11:31:19.327670 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8e2c865_5275_4ece_9348_0120bcaff4d0.slice/crio-6ae39fa6913988502f0b2ee86653016e79ee333503e653fe851f0c3bcc2795c5 WatchSource:0}: Error finding container 6ae39fa6913988502f0b2ee86653016e79ee333503e653fe851f0c3bcc2795c5: Status 404 returned error can't find the container with id 6ae39fa6913988502f0b2ee86653016e79ee333503e653fe851f0c3bcc2795c5 Oct 09 11:31:19 crc kubenswrapper[4740]: I1009 11:31:19.729609 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6swms/crc-debug-8fgzd" event={"ID":"f8e2c865-5275-4ece-9348-0120bcaff4d0","Type":"ContainerStarted","Data":"85bbf8cd270c1a67fbe788fbcb03b9c6e6866544ce8a5b507107ad6ee9738d2b"} Oct 09 11:31:19 crc kubenswrapper[4740]: I1009 11:31:19.730217 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6swms/crc-debug-8fgzd" event={"ID":"f8e2c865-5275-4ece-9348-0120bcaff4d0","Type":"ContainerStarted","Data":"6ae39fa6913988502f0b2ee86653016e79ee333503e653fe851f0c3bcc2795c5"} Oct 
09 11:31:19 crc kubenswrapper[4740]: I1009 11:31:19.760427 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6swms/crc-debug-8fgzd" podStartSLOduration=1.7604108200000002 podStartE2EDuration="1.76041082s" podCreationTimestamp="2025-10-09 11:31:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 11:31:19.75631032 +0000 UTC m=+3818.718510711" watchObservedRunningTime="2025-10-09 11:31:19.76041082 +0000 UTC m=+3818.722611201" Oct 09 11:31:23 crc kubenswrapper[4740]: I1009 11:31:23.753395 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:31:23 crc kubenswrapper[4740]: E1009 11:31:23.754158 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:31:36 crc kubenswrapper[4740]: I1009 11:31:36.753697 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:31:36 crc kubenswrapper[4740]: E1009 11:31:36.754489 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:31:51 crc kubenswrapper[4740]: I1009 11:31:51.765703 4740 
scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:31:51 crc kubenswrapper[4740]: E1009 11:31:51.767252 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:31:56 crc kubenswrapper[4740]: I1009 11:31:56.052688 4740 generic.go:334] "Generic (PLEG): container finished" podID="f8e2c865-5275-4ece-9348-0120bcaff4d0" containerID="85bbf8cd270c1a67fbe788fbcb03b9c6e6866544ce8a5b507107ad6ee9738d2b" exitCode=0 Oct 09 11:31:56 crc kubenswrapper[4740]: I1009 11:31:56.052794 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6swms/crc-debug-8fgzd" event={"ID":"f8e2c865-5275-4ece-9348-0120bcaff4d0","Type":"ContainerDied","Data":"85bbf8cd270c1a67fbe788fbcb03b9c6e6866544ce8a5b507107ad6ee9738d2b"} Oct 09 11:31:57 crc kubenswrapper[4740]: I1009 11:31:57.206811 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6swms/crc-debug-8fgzd" Oct 09 11:31:57 crc kubenswrapper[4740]: I1009 11:31:57.246103 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6swms/crc-debug-8fgzd"] Oct 09 11:31:57 crc kubenswrapper[4740]: I1009 11:31:57.263575 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6swms/crc-debug-8fgzd"] Oct 09 11:31:57 crc kubenswrapper[4740]: I1009 11:31:57.265775 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8e2c865-5275-4ece-9348-0120bcaff4d0-host\") pod \"f8e2c865-5275-4ece-9348-0120bcaff4d0\" (UID: \"f8e2c865-5275-4ece-9348-0120bcaff4d0\") " Oct 09 11:31:57 crc kubenswrapper[4740]: I1009 11:31:57.265911 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g85x8\" (UniqueName: \"kubernetes.io/projected/f8e2c865-5275-4ece-9348-0120bcaff4d0-kube-api-access-g85x8\") pod \"f8e2c865-5275-4ece-9348-0120bcaff4d0\" (UID: \"f8e2c865-5275-4ece-9348-0120bcaff4d0\") " Oct 09 11:31:57 crc kubenswrapper[4740]: I1009 11:31:57.265994 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8e2c865-5275-4ece-9348-0120bcaff4d0-host" (OuterVolumeSpecName: "host") pod "f8e2c865-5275-4ece-9348-0120bcaff4d0" (UID: "f8e2c865-5275-4ece-9348-0120bcaff4d0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 11:31:57 crc kubenswrapper[4740]: I1009 11:31:57.266389 4740 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8e2c865-5275-4ece-9348-0120bcaff4d0-host\") on node \"crc\" DevicePath \"\"" Oct 09 11:31:57 crc kubenswrapper[4740]: I1009 11:31:57.271106 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e2c865-5275-4ece-9348-0120bcaff4d0-kube-api-access-g85x8" (OuterVolumeSpecName: "kube-api-access-g85x8") pod "f8e2c865-5275-4ece-9348-0120bcaff4d0" (UID: "f8e2c865-5275-4ece-9348-0120bcaff4d0"). InnerVolumeSpecName "kube-api-access-g85x8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:31:57 crc kubenswrapper[4740]: I1009 11:31:57.367826 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g85x8\" (UniqueName: \"kubernetes.io/projected/f8e2c865-5275-4ece-9348-0120bcaff4d0-kube-api-access-g85x8\") on node \"crc\" DevicePath \"\"" Oct 09 11:31:57 crc kubenswrapper[4740]: I1009 11:31:57.765639 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8e2c865-5275-4ece-9348-0120bcaff4d0" path="/var/lib/kubelet/pods/f8e2c865-5275-4ece-9348-0120bcaff4d0/volumes" Oct 09 11:31:58 crc kubenswrapper[4740]: I1009 11:31:58.076226 4740 scope.go:117] "RemoveContainer" containerID="85bbf8cd270c1a67fbe788fbcb03b9c6e6866544ce8a5b507107ad6ee9738d2b" Oct 09 11:31:58 crc kubenswrapper[4740]: I1009 11:31:58.076340 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6swms/crc-debug-8fgzd" Oct 09 11:31:58 crc kubenswrapper[4740]: I1009 11:31:58.433394 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6swms/crc-debug-h8596"] Oct 09 11:31:58 crc kubenswrapper[4740]: E1009 11:31:58.433805 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e2c865-5275-4ece-9348-0120bcaff4d0" containerName="container-00" Oct 09 11:31:58 crc kubenswrapper[4740]: I1009 11:31:58.433822 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e2c865-5275-4ece-9348-0120bcaff4d0" containerName="container-00" Oct 09 11:31:58 crc kubenswrapper[4740]: I1009 11:31:58.434094 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e2c865-5275-4ece-9348-0120bcaff4d0" containerName="container-00" Oct 09 11:31:58 crc kubenswrapper[4740]: I1009 11:31:58.434796 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6swms/crc-debug-h8596" Oct 09 11:31:58 crc kubenswrapper[4740]: I1009 11:31:58.436844 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-6swms"/"default-dockercfg-p6sgg" Oct 09 11:31:58 crc kubenswrapper[4740]: I1009 11:31:58.487587 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2745a08e-fde6-4232-bee5-5c15243a550a-host\") pod \"crc-debug-h8596\" (UID: \"2745a08e-fde6-4232-bee5-5c15243a550a\") " pod="openshift-must-gather-6swms/crc-debug-h8596" Oct 09 11:31:58 crc kubenswrapper[4740]: I1009 11:31:58.487737 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwfd5\" (UniqueName: \"kubernetes.io/projected/2745a08e-fde6-4232-bee5-5c15243a550a-kube-api-access-hwfd5\") pod \"crc-debug-h8596\" (UID: \"2745a08e-fde6-4232-bee5-5c15243a550a\") " 
pod="openshift-must-gather-6swms/crc-debug-h8596" Oct 09 11:31:58 crc kubenswrapper[4740]: I1009 11:31:58.590126 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwfd5\" (UniqueName: \"kubernetes.io/projected/2745a08e-fde6-4232-bee5-5c15243a550a-kube-api-access-hwfd5\") pod \"crc-debug-h8596\" (UID: \"2745a08e-fde6-4232-bee5-5c15243a550a\") " pod="openshift-must-gather-6swms/crc-debug-h8596" Oct 09 11:31:58 crc kubenswrapper[4740]: I1009 11:31:58.590265 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2745a08e-fde6-4232-bee5-5c15243a550a-host\") pod \"crc-debug-h8596\" (UID: \"2745a08e-fde6-4232-bee5-5c15243a550a\") " pod="openshift-must-gather-6swms/crc-debug-h8596" Oct 09 11:31:58 crc kubenswrapper[4740]: I1009 11:31:58.590426 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2745a08e-fde6-4232-bee5-5c15243a550a-host\") pod \"crc-debug-h8596\" (UID: \"2745a08e-fde6-4232-bee5-5c15243a550a\") " pod="openshift-must-gather-6swms/crc-debug-h8596" Oct 09 11:31:58 crc kubenswrapper[4740]: I1009 11:31:58.609389 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwfd5\" (UniqueName: \"kubernetes.io/projected/2745a08e-fde6-4232-bee5-5c15243a550a-kube-api-access-hwfd5\") pod \"crc-debug-h8596\" (UID: \"2745a08e-fde6-4232-bee5-5c15243a550a\") " pod="openshift-must-gather-6swms/crc-debug-h8596" Oct 09 11:31:58 crc kubenswrapper[4740]: I1009 11:31:58.752562 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6swms/crc-debug-h8596" Oct 09 11:31:59 crc kubenswrapper[4740]: I1009 11:31:59.091447 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6swms/crc-debug-h8596" event={"ID":"2745a08e-fde6-4232-bee5-5c15243a550a","Type":"ContainerStarted","Data":"b581c3cb3a691c9b8816ae447ac90b4a07d13ab7a2919d802365dcad6cbfb673"} Oct 09 11:32:00 crc kubenswrapper[4740]: I1009 11:32:00.101494 4740 generic.go:334] "Generic (PLEG): container finished" podID="2745a08e-fde6-4232-bee5-5c15243a550a" containerID="286615691eb0715cc683cb49281092bf73d3d2229b89e2d51a857f6e967da040" exitCode=0 Oct 09 11:32:00 crc kubenswrapper[4740]: I1009 11:32:00.101535 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6swms/crc-debug-h8596" event={"ID":"2745a08e-fde6-4232-bee5-5c15243a550a","Type":"ContainerDied","Data":"286615691eb0715cc683cb49281092bf73d3d2229b89e2d51a857f6e967da040"} Oct 09 11:32:00 crc kubenswrapper[4740]: I1009 11:32:00.533799 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6swms/crc-debug-h8596"] Oct 09 11:32:00 crc kubenswrapper[4740]: I1009 11:32:00.545547 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6swms/crc-debug-h8596"] Oct 09 11:32:01 crc kubenswrapper[4740]: I1009 11:32:01.233144 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6swms/crc-debug-h8596" Oct 09 11:32:01 crc kubenswrapper[4740]: I1009 11:32:01.338586 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2745a08e-fde6-4232-bee5-5c15243a550a-host\") pod \"2745a08e-fde6-4232-bee5-5c15243a550a\" (UID: \"2745a08e-fde6-4232-bee5-5c15243a550a\") " Oct 09 11:32:01 crc kubenswrapper[4740]: I1009 11:32:01.338679 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwfd5\" (UniqueName: \"kubernetes.io/projected/2745a08e-fde6-4232-bee5-5c15243a550a-kube-api-access-hwfd5\") pod \"2745a08e-fde6-4232-bee5-5c15243a550a\" (UID: \"2745a08e-fde6-4232-bee5-5c15243a550a\") " Oct 09 11:32:01 crc kubenswrapper[4740]: I1009 11:32:01.338701 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2745a08e-fde6-4232-bee5-5c15243a550a-host" (OuterVolumeSpecName: "host") pod "2745a08e-fde6-4232-bee5-5c15243a550a" (UID: "2745a08e-fde6-4232-bee5-5c15243a550a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 11:32:01 crc kubenswrapper[4740]: I1009 11:32:01.339123 4740 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2745a08e-fde6-4232-bee5-5c15243a550a-host\") on node \"crc\" DevicePath \"\"" Oct 09 11:32:01 crc kubenswrapper[4740]: I1009 11:32:01.344926 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2745a08e-fde6-4232-bee5-5c15243a550a-kube-api-access-hwfd5" (OuterVolumeSpecName: "kube-api-access-hwfd5") pod "2745a08e-fde6-4232-bee5-5c15243a550a" (UID: "2745a08e-fde6-4232-bee5-5c15243a550a"). InnerVolumeSpecName "kube-api-access-hwfd5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:32:01 crc kubenswrapper[4740]: I1009 11:32:01.441303 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwfd5\" (UniqueName: \"kubernetes.io/projected/2745a08e-fde6-4232-bee5-5c15243a550a-kube-api-access-hwfd5\") on node \"crc\" DevicePath \"\"" Oct 09 11:32:01 crc kubenswrapper[4740]: I1009 11:32:01.689168 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6swms/crc-debug-fcscm"] Oct 09 11:32:01 crc kubenswrapper[4740]: E1009 11:32:01.689625 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2745a08e-fde6-4232-bee5-5c15243a550a" containerName="container-00" Oct 09 11:32:01 crc kubenswrapper[4740]: I1009 11:32:01.689644 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="2745a08e-fde6-4232-bee5-5c15243a550a" containerName="container-00" Oct 09 11:32:01 crc kubenswrapper[4740]: I1009 11:32:01.689897 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="2745a08e-fde6-4232-bee5-5c15243a550a" containerName="container-00" Oct 09 11:32:01 crc kubenswrapper[4740]: I1009 11:32:01.690577 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6swms/crc-debug-fcscm" Oct 09 11:32:01 crc kubenswrapper[4740]: I1009 11:32:01.746507 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e8ed1fb-af00-41f3-87ed-5083c12b72e2-host\") pod \"crc-debug-fcscm\" (UID: \"3e8ed1fb-af00-41f3-87ed-5083c12b72e2\") " pod="openshift-must-gather-6swms/crc-debug-fcscm" Oct 09 11:32:01 crc kubenswrapper[4740]: I1009 11:32:01.746626 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rq49\" (UniqueName: \"kubernetes.io/projected/3e8ed1fb-af00-41f3-87ed-5083c12b72e2-kube-api-access-6rq49\") pod \"crc-debug-fcscm\" (UID: \"3e8ed1fb-af00-41f3-87ed-5083c12b72e2\") " pod="openshift-must-gather-6swms/crc-debug-fcscm" Oct 09 11:32:01 crc kubenswrapper[4740]: I1009 11:32:01.766834 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2745a08e-fde6-4232-bee5-5c15243a550a" path="/var/lib/kubelet/pods/2745a08e-fde6-4232-bee5-5c15243a550a/volumes" Oct 09 11:32:01 crc kubenswrapper[4740]: I1009 11:32:01.848925 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rq49\" (UniqueName: \"kubernetes.io/projected/3e8ed1fb-af00-41f3-87ed-5083c12b72e2-kube-api-access-6rq49\") pod \"crc-debug-fcscm\" (UID: \"3e8ed1fb-af00-41f3-87ed-5083c12b72e2\") " pod="openshift-must-gather-6swms/crc-debug-fcscm" Oct 09 11:32:01 crc kubenswrapper[4740]: I1009 11:32:01.849123 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e8ed1fb-af00-41f3-87ed-5083c12b72e2-host\") pod \"crc-debug-fcscm\" (UID: \"3e8ed1fb-af00-41f3-87ed-5083c12b72e2\") " pod="openshift-must-gather-6swms/crc-debug-fcscm" Oct 09 11:32:01 crc kubenswrapper[4740]: I1009 11:32:01.849333 4740 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e8ed1fb-af00-41f3-87ed-5083c12b72e2-host\") pod \"crc-debug-fcscm\" (UID: \"3e8ed1fb-af00-41f3-87ed-5083c12b72e2\") " pod="openshift-must-gather-6swms/crc-debug-fcscm" Oct 09 11:32:01 crc kubenswrapper[4740]: I1009 11:32:01.868838 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rq49\" (UniqueName: \"kubernetes.io/projected/3e8ed1fb-af00-41f3-87ed-5083c12b72e2-kube-api-access-6rq49\") pod \"crc-debug-fcscm\" (UID: \"3e8ed1fb-af00-41f3-87ed-5083c12b72e2\") " pod="openshift-must-gather-6swms/crc-debug-fcscm" Oct 09 11:32:02 crc kubenswrapper[4740]: I1009 11:32:02.008081 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6swms/crc-debug-fcscm" Oct 09 11:32:02 crc kubenswrapper[4740]: W1009 11:32:02.053735 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e8ed1fb_af00_41f3_87ed_5083c12b72e2.slice/crio-4158929331dc23573af2151c986d51e9212b178108458371727f6f9612f25cb0 WatchSource:0}: Error finding container 4158929331dc23573af2151c986d51e9212b178108458371727f6f9612f25cb0: Status 404 returned error can't find the container with id 4158929331dc23573af2151c986d51e9212b178108458371727f6f9612f25cb0 Oct 09 11:32:02 crc kubenswrapper[4740]: I1009 11:32:02.121218 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6swms/crc-debug-fcscm" event={"ID":"3e8ed1fb-af00-41f3-87ed-5083c12b72e2","Type":"ContainerStarted","Data":"4158929331dc23573af2151c986d51e9212b178108458371727f6f9612f25cb0"} Oct 09 11:32:02 crc kubenswrapper[4740]: I1009 11:32:02.131160 4740 scope.go:117] "RemoveContainer" containerID="286615691eb0715cc683cb49281092bf73d3d2229b89e2d51a857f6e967da040" Oct 09 11:32:02 crc kubenswrapper[4740]: I1009 11:32:02.131341 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6swms/crc-debug-h8596" Oct 09 11:32:03 crc kubenswrapper[4740]: I1009 11:32:03.139347 4740 generic.go:334] "Generic (PLEG): container finished" podID="3e8ed1fb-af00-41f3-87ed-5083c12b72e2" containerID="cf11843c7be51f99c87c9045340cdda83d620640fd70e3bb0155163a682c093c" exitCode=0 Oct 09 11:32:03 crc kubenswrapper[4740]: I1009 11:32:03.139463 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6swms/crc-debug-fcscm" event={"ID":"3e8ed1fb-af00-41f3-87ed-5083c12b72e2","Type":"ContainerDied","Data":"cf11843c7be51f99c87c9045340cdda83d620640fd70e3bb0155163a682c093c"} Oct 09 11:32:03 crc kubenswrapper[4740]: I1009 11:32:03.179440 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6swms/crc-debug-fcscm"] Oct 09 11:32:03 crc kubenswrapper[4740]: I1009 11:32:03.187080 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6swms/crc-debug-fcscm"] Oct 09 11:32:04 crc kubenswrapper[4740]: I1009 11:32:04.250888 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6swms/crc-debug-fcscm" Oct 09 11:32:04 crc kubenswrapper[4740]: I1009 11:32:04.294618 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rq49\" (UniqueName: \"kubernetes.io/projected/3e8ed1fb-af00-41f3-87ed-5083c12b72e2-kube-api-access-6rq49\") pod \"3e8ed1fb-af00-41f3-87ed-5083c12b72e2\" (UID: \"3e8ed1fb-af00-41f3-87ed-5083c12b72e2\") " Oct 09 11:32:04 crc kubenswrapper[4740]: I1009 11:32:04.294777 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e8ed1fb-af00-41f3-87ed-5083c12b72e2-host\") pod \"3e8ed1fb-af00-41f3-87ed-5083c12b72e2\" (UID: \"3e8ed1fb-af00-41f3-87ed-5083c12b72e2\") " Oct 09 11:32:04 crc kubenswrapper[4740]: I1009 11:32:04.294839 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e8ed1fb-af00-41f3-87ed-5083c12b72e2-host" (OuterVolumeSpecName: "host") pod "3e8ed1fb-af00-41f3-87ed-5083c12b72e2" (UID: "3e8ed1fb-af00-41f3-87ed-5083c12b72e2"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 11:32:04 crc kubenswrapper[4740]: I1009 11:32:04.295128 4740 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e8ed1fb-af00-41f3-87ed-5083c12b72e2-host\") on node \"crc\" DevicePath \"\"" Oct 09 11:32:04 crc kubenswrapper[4740]: I1009 11:32:04.299945 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e8ed1fb-af00-41f3-87ed-5083c12b72e2-kube-api-access-6rq49" (OuterVolumeSpecName: "kube-api-access-6rq49") pod "3e8ed1fb-af00-41f3-87ed-5083c12b72e2" (UID: "3e8ed1fb-af00-41f3-87ed-5083c12b72e2"). InnerVolumeSpecName "kube-api-access-6rq49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:32:04 crc kubenswrapper[4740]: I1009 11:32:04.396811 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rq49\" (UniqueName: \"kubernetes.io/projected/3e8ed1fb-af00-41f3-87ed-5083c12b72e2-kube-api-access-6rq49\") on node \"crc\" DevicePath \"\"" Oct 09 11:32:04 crc kubenswrapper[4740]: I1009 11:32:04.753835 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:32:04 crc kubenswrapper[4740]: E1009 11:32:04.754180 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:32:05 crc kubenswrapper[4740]: I1009 11:32:05.158127 4740 scope.go:117] "RemoveContainer" containerID="cf11843c7be51f99c87c9045340cdda83d620640fd70e3bb0155163a682c093c" Oct 09 11:32:05 crc kubenswrapper[4740]: I1009 11:32:05.158568 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6swms/crc-debug-fcscm" Oct 09 11:32:05 crc kubenswrapper[4740]: I1009 11:32:05.763830 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e8ed1fb-af00-41f3-87ed-5083c12b72e2" path="/var/lib/kubelet/pods/3e8ed1fb-af00-41f3-87ed-5083c12b72e2/volumes" Oct 09 11:32:13 crc kubenswrapper[4740]: I1009 11:32:13.661454 4740 scope.go:117] "RemoveContainer" containerID="552c84fb0547b9edd0702a862120930ffca8694d14aa5c3a7166d7e2bc55172a" Oct 09 11:32:17 crc kubenswrapper[4740]: I1009 11:32:17.091047 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-544757df48-b9dz7_dc9b7872-3887-45a9-8405-506862479e3f/barbican-api/0.log" Oct 09 11:32:17 crc kubenswrapper[4740]: I1009 11:32:17.144865 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-544757df48-b9dz7_dc9b7872-3887-45a9-8405-506862479e3f/barbican-api-log/0.log" Oct 09 11:32:17 crc kubenswrapper[4740]: I1009 11:32:17.294399 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5bd7855d54-lgqzd_72bcd07c-fbd9-44cb-8295-ba498f012009/barbican-keystone-listener/0.log" Oct 09 11:32:17 crc kubenswrapper[4740]: I1009 11:32:17.335520 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5bd7855d54-lgqzd_72bcd07c-fbd9-44cb-8295-ba498f012009/barbican-keystone-listener-log/0.log" Oct 09 11:32:17 crc kubenswrapper[4740]: I1009 11:32:17.448268 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-59d79d879-w5c9m_8db36903-c2ef-429f-97dd-46e98c2a061b/barbican-worker/0.log" Oct 09 11:32:17 crc kubenswrapper[4740]: I1009 11:32:17.498035 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-59d79d879-w5c9m_8db36903-c2ef-429f-97dd-46e98c2a061b/barbican-worker-log/0.log" Oct 09 11:32:17 crc kubenswrapper[4740]: I1009 11:32:17.611164 4740 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-7t9x8_3abd5479-bb5c-4f2b-bda4-0aa1c28bd1b8/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:32:17 crc kubenswrapper[4740]: I1009 11:32:17.715515 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5eefd278-fab1-4acc-acca-b6474799e6d1/ceilometer-central-agent/0.log" Oct 09 11:32:17 crc kubenswrapper[4740]: I1009 11:32:17.753347 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:32:17 crc kubenswrapper[4740]: E1009 11:32:17.753603 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:32:17 crc kubenswrapper[4740]: I1009 11:32:17.760292 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5eefd278-fab1-4acc-acca-b6474799e6d1/ceilometer-notification-agent/0.log" Oct 09 11:32:17 crc kubenswrapper[4740]: I1009 11:32:17.800709 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5eefd278-fab1-4acc-acca-b6474799e6d1/proxy-httpd/0.log" Oct 09 11:32:17 crc kubenswrapper[4740]: I1009 11:32:17.896099 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5eefd278-fab1-4acc-acca-b6474799e6d1/sg-core/0.log" Oct 09 11:32:17 crc kubenswrapper[4740]: I1009 11:32:17.981117 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_56b8e346-13ed-4f64-88af-13be77ceddfa/cinder-api/0.log" Oct 09 11:32:17 crc kubenswrapper[4740]: I1009 
11:32:17.986462 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_56b8e346-13ed-4f64-88af-13be77ceddfa/cinder-api-log/0.log" Oct 09 11:32:18 crc kubenswrapper[4740]: I1009 11:32:18.139859 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c/cinder-scheduler/0.log" Oct 09 11:32:18 crc kubenswrapper[4740]: I1009 11:32:18.205163 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f22418d4-a6c6-4e33-a96e-3e8c2d4a5e1c/probe/0.log" Oct 09 11:32:18 crc kubenswrapper[4740]: I1009 11:32:18.370219 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-wndn5_091c1607-1916-4dfd-9e3d-95dbe5534e98/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:32:18 crc kubenswrapper[4740]: I1009 11:32:18.455660 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-cr2p7_d13c7792-e2d1-4ce2-b965-f77bd77b0cd0/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:32:18 crc kubenswrapper[4740]: I1009 11:32:18.568054 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-dxsxr_6b62efe3-f320-4b06-9b4f-6cdebea2c83c/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:32:18 crc kubenswrapper[4740]: I1009 11:32:18.672859 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-sq7qc_a952fe70-b037-4995-a678-b3da7312dcee/init/0.log" Oct 09 11:32:18 crc kubenswrapper[4740]: I1009 11:32:18.839088 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-sq7qc_a952fe70-b037-4995-a678-b3da7312dcee/init/0.log" Oct 09 11:32:18 crc kubenswrapper[4740]: I1009 11:32:18.944456 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-s79wz_7ced3562-d429-4443-9aa2-82901f4f7797/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:32:18 crc kubenswrapper[4740]: I1009 11:32:18.953503 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-sq7qc_a952fe70-b037-4995-a678-b3da7312dcee/dnsmasq-dns/0.log" Oct 09 11:32:19 crc kubenswrapper[4740]: I1009 11:32:19.103021 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_83ecd586-6121-4e74-91f1-87267432cc2d/glance-log/0.log" Oct 09 11:32:19 crc kubenswrapper[4740]: I1009 11:32:19.163853 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_83ecd586-6121-4e74-91f1-87267432cc2d/glance-httpd/0.log" Oct 09 11:32:19 crc kubenswrapper[4740]: I1009 11:32:19.306463 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5eae963c-dabb-4da9-ac57-86a621088e55/glance-log/0.log" Oct 09 11:32:19 crc kubenswrapper[4740]: I1009 11:32:19.321130 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5eae963c-dabb-4da9-ac57-86a621088e55/glance-httpd/0.log" Oct 09 11:32:19 crc kubenswrapper[4740]: I1009 11:32:19.489516 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5dd4b95776-lcxbt_3762ae93-7451-4d99-aad4-f9c68666cf40/horizon/0.log" Oct 09 11:32:19 crc kubenswrapper[4740]: I1009 11:32:19.620778 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-h6knn_d611b217-c3b5-49dd-9a5f-acd64171310d/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:32:19 crc kubenswrapper[4740]: I1009 11:32:19.821935 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-5dd4b95776-lcxbt_3762ae93-7451-4d99-aad4-f9c68666cf40/horizon-log/0.log" Oct 09 11:32:19 crc kubenswrapper[4740]: I1009 11:32:19.849929 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-8fhb7_52c814fd-0700-4e3e-8302-19324617f7c5/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:32:20 crc kubenswrapper[4740]: I1009 11:32:20.000379 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29333461-s8498_56c17d1b-f3e0-4ca7-ad1f-ac1314036f59/keystone-cron/0.log" Oct 09 11:32:20 crc kubenswrapper[4740]: I1009 11:32:20.151326 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5db569f5cf-ksc2p_d2497d66-a643-4eb4-b69d-725db422cb3a/keystone-api/0.log" Oct 09 11:32:20 crc kubenswrapper[4740]: I1009 11:32:20.214907 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_36a3a627-eea7-4034-a615-38c388851e07/kube-state-metrics/0.log" Oct 09 11:32:20 crc kubenswrapper[4740]: I1009 11:32:20.384618 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-nrg72_55748bea-018d-4297-8939-ffec480b42ba/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:32:20 crc kubenswrapper[4740]: I1009 11:32:20.634117 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-d74f6589-zvlln_94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19/neutron-httpd/0.log" Oct 09 11:32:20 crc kubenswrapper[4740]: I1009 11:32:20.661017 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-d74f6589-zvlln_94af5dc9-531f-4eb0-bc3f-7f21b6b7fb19/neutron-api/0.log" Oct 09 11:32:20 crc kubenswrapper[4740]: I1009 11:32:20.703485 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-gjdvj_cdb17de8-f861-4899-8e4d-455cd554cf43/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:32:21 crc kubenswrapper[4740]: I1009 11:32:21.173618 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_260e0d21-3655-4a0d-a51e-6c483e20c7f5/nova-api-log/0.log" Oct 09 11:32:21 crc kubenswrapper[4740]: I1009 11:32:21.341799 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_305a12c3-450d-43fc-87bb-9bb293438451/nova-cell0-conductor-conductor/0.log" Oct 09 11:32:21 crc kubenswrapper[4740]: I1009 11:32:21.597971 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_c11f736a-8dcf-45d6-9f8d-7ff8866458fb/nova-cell1-conductor-conductor/0.log" Oct 09 11:32:21 crc kubenswrapper[4740]: I1009 11:32:21.662102 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_260e0d21-3655-4a0d-a51e-6c483e20c7f5/nova-api-api/0.log" Oct 09 11:32:21 crc kubenswrapper[4740]: I1009 11:32:21.673128 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5e2d4c31-bba4-46d5-8119-b0970e10437d/nova-cell1-novncproxy-novncproxy/0.log" Oct 09 11:32:21 crc kubenswrapper[4740]: I1009 11:32:21.860714 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-frpjr_914ecc49-bc7d-4c7d-b2f7-1fd9fbb42207/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:32:21 crc kubenswrapper[4740]: I1009 11:32:21.932989 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_df3b597d-a996-4ebd-b896-61c6c62a0145/nova-metadata-log/0.log" Oct 09 11:32:22 crc kubenswrapper[4740]: I1009 11:32:22.288585 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_dcd9d52b-8167-47f9-8c36-b75f88119ad5/mysql-bootstrap/0.log" Oct 09 11:32:22 crc kubenswrapper[4740]: I1009 11:32:22.318481 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_3bb674db-a6fb-4100-82d2-2fae6660902b/nova-scheduler-scheduler/0.log" Oct 09 11:32:22 crc kubenswrapper[4740]: I1009 11:32:22.521107 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_dcd9d52b-8167-47f9-8c36-b75f88119ad5/mysql-bootstrap/0.log" Oct 09 11:32:22 crc kubenswrapper[4740]: I1009 11:32:22.587383 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_dcd9d52b-8167-47f9-8c36-b75f88119ad5/galera/0.log" Oct 09 11:32:22 crc kubenswrapper[4740]: I1009 11:32:22.746123 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3dce8908-af4b-4596-bed2-02788a615207/mysql-bootstrap/0.log" Oct 09 11:32:22 crc kubenswrapper[4740]: I1009 11:32:22.915017 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3dce8908-af4b-4596-bed2-02788a615207/mysql-bootstrap/0.log" Oct 09 11:32:22 crc kubenswrapper[4740]: I1009 11:32:22.977681 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3dce8908-af4b-4596-bed2-02788a615207/galera/0.log" Oct 09 11:32:23 crc kubenswrapper[4740]: I1009 11:32:23.101576 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_efa49127-ef96-4ed6-8b72-c106e5575707/openstackclient/0.log" Oct 09 11:32:23 crc kubenswrapper[4740]: I1009 11:32:23.182993 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_df3b597d-a996-4ebd-b896-61c6c62a0145/nova-metadata-metadata/0.log" Oct 09 11:32:23 crc kubenswrapper[4740]: I1009 11:32:23.201017 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-c6rld_7f56ff38-de3a-4c48-8fc0-43e0eac26c55/ovn-controller/0.log" Oct 09 11:32:23 crc kubenswrapper[4740]: I1009 11:32:23.365568 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-pn5fj_94fc74ef-6b90-4c9a-9da5-d7eb116a7806/openstack-network-exporter/0.log" Oct 09 11:32:23 crc kubenswrapper[4740]: I1009 11:32:23.425347 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cwdss_5a1841e0-a15d-4dca-a1a4-6b50f338ddbc/ovsdb-server-init/0.log" Oct 09 11:32:23 crc kubenswrapper[4740]: I1009 11:32:23.666944 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cwdss_5a1841e0-a15d-4dca-a1a4-6b50f338ddbc/ovsdb-server/0.log" Oct 09 11:32:23 crc kubenswrapper[4740]: I1009 11:32:23.671850 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cwdss_5a1841e0-a15d-4dca-a1a4-6b50f338ddbc/ovs-vswitchd/0.log" Oct 09 11:32:23 crc kubenswrapper[4740]: I1009 11:32:23.713111 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cwdss_5a1841e0-a15d-4dca-a1a4-6b50f338ddbc/ovsdb-server-init/0.log" Oct 09 11:32:23 crc kubenswrapper[4740]: I1009 11:32:23.902449 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-hd9p5_72061500-62b1-404d-8def-280fcca2e73f/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:32:23 crc kubenswrapper[4740]: I1009 11:32:23.906956 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_eda487bd-e994-4fce-86f9-50e85aaf30b2/openstack-network-exporter/0.log" Oct 09 11:32:23 crc kubenswrapper[4740]: I1009 11:32:23.979652 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_eda487bd-e994-4fce-86f9-50e85aaf30b2/ovn-northd/0.log" Oct 09 11:32:24 crc kubenswrapper[4740]: I1009 11:32:24.080098 4740 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0c37175d-6801-461a-82ba-ea611afdaebf/openstack-network-exporter/0.log" Oct 09 11:32:24 crc kubenswrapper[4740]: I1009 11:32:24.168093 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0c37175d-6801-461a-82ba-ea611afdaebf/ovsdbserver-nb/0.log" Oct 09 11:32:24 crc kubenswrapper[4740]: I1009 11:32:24.276807 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_52b13ae3-8184-4ea2-a6b5-14d739b1200e/openstack-network-exporter/0.log" Oct 09 11:32:24 crc kubenswrapper[4740]: I1009 11:32:24.387538 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_52b13ae3-8184-4ea2-a6b5-14d739b1200e/ovsdbserver-sb/0.log" Oct 09 11:32:24 crc kubenswrapper[4740]: I1009 11:32:24.586008 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-55c77867db-hsc8q_48968716-1198-429f-90f0-ab6663baaed5/placement-api/0.log" Oct 09 11:32:24 crc kubenswrapper[4740]: I1009 11:32:24.619395 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-55c77867db-hsc8q_48968716-1198-429f-90f0-ab6663baaed5/placement-log/0.log" Oct 09 11:32:24 crc kubenswrapper[4740]: I1009 11:32:24.673517 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46/setup-container/0.log" Oct 09 11:32:24 crc kubenswrapper[4740]: I1009 11:32:24.816730 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46/setup-container/0.log" Oct 09 11:32:24 crc kubenswrapper[4740]: I1009 11:32:24.903116 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ff4b6585-91c6-48f8-ba40-5cd075c7c59e/setup-container/0.log" Oct 09 11:32:24 crc kubenswrapper[4740]: I1009 11:32:24.933075 4740 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8ef6dccc-d229-4bb8-8fb2-5c6f859ecb46/rabbitmq/0.log" Oct 09 11:32:25 crc kubenswrapper[4740]: I1009 11:32:25.094565 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ff4b6585-91c6-48f8-ba40-5cd075c7c59e/setup-container/0.log" Oct 09 11:32:25 crc kubenswrapper[4740]: I1009 11:32:25.183831 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-qz5cg_3f30a224-f5af-498e-97f3-28a5a26f9884/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:32:25 crc kubenswrapper[4740]: I1009 11:32:25.202329 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ff4b6585-91c6-48f8-ba40-5cd075c7c59e/rabbitmq/0.log" Oct 09 11:32:25 crc kubenswrapper[4740]: I1009 11:32:25.372578 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-4jjwr_1bb88dfa-ffc6-433a-9df2-f00e2a6805e7/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:32:25 crc kubenswrapper[4740]: I1009 11:32:25.459182 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-6t89s_945008af-c262-4581-8f40-51b8fe5a9dd8/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:32:25 crc kubenswrapper[4740]: I1009 11:32:25.607862 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-w8hc6_c79d4035-1be0-44ff-9ddd-0a65a54be7ed/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:32:25 crc kubenswrapper[4740]: I1009 11:32:25.684311 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-9vbmb_0cc7d46b-528d-415b-a1cf-34ea3e4483b5/ssh-known-hosts-edpm-deployment/0.log" Oct 09 11:32:25 crc kubenswrapper[4740]: I1009 11:32:25.892908 4740 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-proxy-66569d88ff-tjljh_501b9024-4f9f-41eb-ae73-d9ecb0637363/proxy-server/0.log" Oct 09 11:32:26 crc kubenswrapper[4740]: I1009 11:32:26.024794 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-66569d88ff-tjljh_501b9024-4f9f-41eb-ae73-d9ecb0637363/proxy-httpd/0.log" Oct 09 11:32:26 crc kubenswrapper[4740]: I1009 11:32:26.051492 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-vzk5q_ebeb8396-40be-4400-8a2f-d1cdeb8c20e4/swift-ring-rebalance/0.log" Oct 09 11:32:26 crc kubenswrapper[4740]: I1009 11:32:26.120052 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/account-auditor/0.log" Oct 09 11:32:26 crc kubenswrapper[4740]: I1009 11:32:26.232048 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/account-reaper/0.log" Oct 09 11:32:26 crc kubenswrapper[4740]: I1009 11:32:26.258219 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/account-replicator/0.log" Oct 09 11:32:26 crc kubenswrapper[4740]: I1009 11:32:26.329726 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/account-server/0.log" Oct 09 11:32:26 crc kubenswrapper[4740]: I1009 11:32:26.421028 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/container-auditor/0.log" Oct 09 11:32:26 crc kubenswrapper[4740]: I1009 11:32:26.489484 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/container-server/0.log" Oct 09 11:32:26 crc kubenswrapper[4740]: I1009 11:32:26.536640 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/container-replicator/0.log" Oct 09 11:32:26 crc kubenswrapper[4740]: I1009 11:32:26.617575 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/container-updater/0.log" Oct 09 11:32:26 crc kubenswrapper[4740]: I1009 11:32:26.722401 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/object-expirer/0.log" Oct 09 11:32:26 crc kubenswrapper[4740]: I1009 11:32:26.724530 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/object-auditor/0.log" Oct 09 11:32:26 crc kubenswrapper[4740]: I1009 11:32:26.738565 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/object-replicator/0.log" Oct 09 11:32:26 crc kubenswrapper[4740]: I1009 11:32:26.804672 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/object-server/0.log" Oct 09 11:32:26 crc kubenswrapper[4740]: I1009 11:32:26.916444 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/rsync/0.log" Oct 09 11:32:26 crc kubenswrapper[4740]: I1009 11:32:26.929467 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/object-updater/0.log" Oct 09 11:32:26 crc kubenswrapper[4740]: I1009 11:32:26.966740 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a11218-32c1-4b40-a738-f56e795904d7/swift-recon-cron/0.log" Oct 09 11:32:27 crc kubenswrapper[4740]: I1009 11:32:27.323313 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-ggkft_40e8133a-5380-4983-a96f-8f28d50108a9/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:32:27 crc kubenswrapper[4740]: I1009 11:32:27.343264 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_4c1a2aba-0872-4bef-9bad-0ba37788423d/tempest-tests-tempest-tests-runner/0.log" Oct 09 11:32:27 crc kubenswrapper[4740]: I1009 11:32:27.468452 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0486d8ef-3b23-4d76-9764-a3d48c174482/test-operator-logs-container/0.log" Oct 09 11:32:27 crc kubenswrapper[4740]: I1009 11:32:27.576137 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-t2ngq_19338c28-ee36-4273-8f74-f34767a3fcb1/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 11:32:29 crc kubenswrapper[4740]: I1009 11:32:29.753966 4740 scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:32:29 crc kubenswrapper[4740]: E1009 11:32:29.754415 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kdjch_openshift-machine-config-operator(223b849a-db98-4f56-a649-9e144189950a)\"" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" Oct 09 11:32:38 crc kubenswrapper[4740]: I1009 11:32:38.058423 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f16fec59-73b1-4b57-ab47-c1767c6c2a7d/memcached/0.log" Oct 09 11:32:41 crc kubenswrapper[4740]: I1009 11:32:41.753238 4740 scope.go:117] "RemoveContainer" 
containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930" Oct 09 11:32:42 crc kubenswrapper[4740]: I1009 11:32:42.499170 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerStarted","Data":"ae51d440167983a5287ca7bb6e99bb81cf70c62b4ac5f857d15a24ed78f2d8d3"} Oct 09 11:32:50 crc kubenswrapper[4740]: I1009 11:32:50.885518 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-jqnnv_93f4faa8-4d5e-48d9-ac5a-bb1468f972d3/kube-rbac-proxy/0.log" Oct 09 11:32:50 crc kubenswrapper[4740]: I1009 11:32:50.915616 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-jqnnv_93f4faa8-4d5e-48d9-ac5a-bb1468f972d3/manager/0.log" Oct 09 11:32:51 crc kubenswrapper[4740]: I1009 11:32:51.114982 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-mp2p4_95b86671-972c-4a57-b68b-0421b82bd3d4/manager/0.log" Oct 09 11:32:51 crc kubenswrapper[4740]: I1009 11:32:51.124649 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-mp2p4_95b86671-972c-4a57-b68b-0421b82bd3d4/kube-rbac-proxy/0.log" Oct 09 11:32:51 crc kubenswrapper[4740]: I1009 11:32:51.256698 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44_d1d10dcf-922d-4d14-ac25-0b8482757670/util/0.log" Oct 09 11:32:51 crc kubenswrapper[4740]: I1009 11:32:51.412917 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44_d1d10dcf-922d-4d14-ac25-0b8482757670/util/0.log" Oct 09 11:32:51 crc 
kubenswrapper[4740]: I1009 11:32:51.444875 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44_d1d10dcf-922d-4d14-ac25-0b8482757670/pull/0.log" Oct 09 11:32:51 crc kubenswrapper[4740]: I1009 11:32:51.472101 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44_d1d10dcf-922d-4d14-ac25-0b8482757670/pull/0.log" Oct 09 11:32:51 crc kubenswrapper[4740]: I1009 11:32:51.670719 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44_d1d10dcf-922d-4d14-ac25-0b8482757670/pull/0.log" Oct 09 11:32:51 crc kubenswrapper[4740]: I1009 11:32:51.690474 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44_d1d10dcf-922d-4d14-ac25-0b8482757670/extract/0.log" Oct 09 11:32:51 crc kubenswrapper[4740]: I1009 11:32:51.737841 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d62c0b0d7057e9a8f77922a9302da78aee2de61172b9d08470860540bebhh44_d1d10dcf-922d-4d14-ac25-0b8482757670/util/0.log" Oct 09 11:32:51 crc kubenswrapper[4740]: I1009 11:32:51.833804 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-w2ftw_2b3fb0c8-988f-4ed4-86e0-77db8e5e06a8/kube-rbac-proxy/0.log" Oct 09 11:32:51 crc kubenswrapper[4740]: I1009 11:32:51.933747 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-w2ftw_2b3fb0c8-988f-4ed4-86e0-77db8e5e06a8/manager/0.log" Oct 09 11:32:51 crc kubenswrapper[4740]: I1009 11:32:51.957856 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-268g9_a3c08e43-cc8b-433e-ba8e-fd225eef09ed/kube-rbac-proxy/0.log" Oct 09 11:32:52 crc kubenswrapper[4740]: I1009 11:32:52.135536 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-268g9_a3c08e43-cc8b-433e-ba8e-fd225eef09ed/manager/0.log" Oct 09 11:32:52 crc kubenswrapper[4740]: I1009 11:32:52.149995 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-sf7cf_1519e3af-34c9-4722-9aaa-8a10ef0d49de/kube-rbac-proxy/0.log" Oct 09 11:32:52 crc kubenswrapper[4740]: I1009 11:32:52.193969 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-sf7cf_1519e3af-34c9-4722-9aaa-8a10ef0d49de/manager/0.log" Oct 09 11:32:52 crc kubenswrapper[4740]: I1009 11:32:52.311083 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-p4btw_5348e551-de55-4c32-af1e-ac9facc061d9/kube-rbac-proxy/0.log" Oct 09 11:32:52 crc kubenswrapper[4740]: I1009 11:32:52.349087 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-p4btw_5348e551-de55-4c32-af1e-ac9facc061d9/manager/0.log" Oct 09 11:32:52 crc kubenswrapper[4740]: I1009 11:32:52.451426 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-4zsvx_27b8cb71-5bd2-4133-bf5a-db571521861b/kube-rbac-proxy/0.log" Oct 09 11:32:52 crc kubenswrapper[4740]: I1009 11:32:52.600587 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-4zsvx_27b8cb71-5bd2-4133-bf5a-db571521861b/manager/0.log" Oct 09 11:32:52 crc kubenswrapper[4740]: I1009 11:32:52.605858 4740 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-prf5f_8ae60958-f755-47fd-891b-74356bff787c/manager/0.log" Oct 09 11:32:52 crc kubenswrapper[4740]: I1009 11:32:52.639961 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-prf5f_8ae60958-f755-47fd-891b-74356bff787c/kube-rbac-proxy/0.log" Oct 09 11:32:52 crc kubenswrapper[4740]: I1009 11:32:52.789632 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-hsl94_67fd364b-d05e-4d57-a817-3f64be5cdba0/kube-rbac-proxy/0.log" Oct 09 11:32:52 crc kubenswrapper[4740]: I1009 11:32:52.839675 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-hsl94_67fd364b-d05e-4d57-a817-3f64be5cdba0/manager/0.log" Oct 09 11:32:52 crc kubenswrapper[4740]: I1009 11:32:52.963291 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-6jtst_f3e10f4d-eabe-4818-b34f-96dd2ba4d4a1/manager/0.log" Oct 09 11:32:52 crc kubenswrapper[4740]: I1009 11:32:52.992380 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-6jtst_f3e10f4d-eabe-4818-b34f-96dd2ba4d4a1/kube-rbac-proxy/0.log" Oct 09 11:32:53 crc kubenswrapper[4740]: I1009 11:32:53.078869 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-cz2dz_8313eb28-2711-404c-817c-b782ea1cf41a/kube-rbac-proxy/0.log" Oct 09 11:32:53 crc kubenswrapper[4740]: I1009 11:32:53.151790 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-cz2dz_8313eb28-2711-404c-817c-b782ea1cf41a/manager/0.log" Oct 09 11:32:53 crc 
kubenswrapper[4740]: I1009 11:32:53.218713 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-h4lw2_254d742d-881a-4ea9-97fd-2246d7109a77/kube-rbac-proxy/0.log" Oct 09 11:32:53 crc kubenswrapper[4740]: I1009 11:32:53.303077 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-h4lw2_254d742d-881a-4ea9-97fd-2246d7109a77/manager/0.log" Oct 09 11:32:53 crc kubenswrapper[4740]: I1009 11:32:53.374700 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-kv5jg_676e4e26-21ec-4b2c-ab3f-bc593cddfb33/kube-rbac-proxy/0.log" Oct 09 11:32:53 crc kubenswrapper[4740]: I1009 11:32:53.515725 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-kv5jg_676e4e26-21ec-4b2c-ab3f-bc593cddfb33/manager/0.log" Oct 09 11:32:53 crc kubenswrapper[4740]: I1009 11:32:53.577959 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-8nkmb_a9f75d3c-e107-48aa-b15b-442b785b8945/kube-rbac-proxy/0.log" Oct 09 11:32:53 crc kubenswrapper[4740]: I1009 11:32:53.609908 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-8nkmb_a9f75d3c-e107-48aa-b15b-442b785b8945/manager/0.log" Oct 09 11:32:53 crc kubenswrapper[4740]: I1009 11:32:53.759902 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx_3179f3c7-2f14-494b-9fea-3c217a11af2b/kube-rbac-proxy/0.log" Oct 09 11:32:53 crc kubenswrapper[4740]: I1009 11:32:53.810637 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dc2xsx_3179f3c7-2f14-494b-9fea-3c217a11af2b/manager/0.log" Oct 09 11:32:53 crc kubenswrapper[4740]: I1009 11:32:53.922543 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5647484f69-cxbqt_cd1eb7dc-bc88-4a8e-b681-751ebdf2089f/kube-rbac-proxy/0.log" Oct 09 11:32:54 crc kubenswrapper[4740]: I1009 11:32:54.069138 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6c6ccc6df6-wxp9l_e0e26b11-7270-46ec-9042-0eaab1e2a459/kube-rbac-proxy/0.log" Oct 09 11:32:54 crc kubenswrapper[4740]: I1009 11:32:54.266521 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6c6ccc6df6-wxp9l_e0e26b11-7270-46ec-9042-0eaab1e2a459/operator/0.log" Oct 09 11:32:54 crc kubenswrapper[4740]: I1009 11:32:54.303637 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-gvdpg_aad20335-936b-4ec4-aced-424bf31edf74/registry-server/0.log" Oct 09 11:32:54 crc kubenswrapper[4740]: I1009 11:32:54.502275 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f96f8c84-r494n_7e7f599f-1cc9-41fc-b683-8b0de6e48761/kube-rbac-proxy/0.log" Oct 09 11:32:54 crc kubenswrapper[4740]: I1009 11:32:54.611667 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f96f8c84-r494n_7e7f599f-1cc9-41fc-b683-8b0de6e48761/manager/0.log" Oct 09 11:32:54 crc kubenswrapper[4740]: I1009 11:32:54.627223 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-q5tlk_f1909d9f-c6e3-4c55-93f5-be679e3c3792/kube-rbac-proxy/0.log" Oct 09 11:32:54 crc kubenswrapper[4740]: I1009 
11:32:54.838938 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-c5n9g_00c4b19b-1c03-4fc2-9ac1-39ca45ca9570/operator/0.log" Oct 09 11:32:54 crc kubenswrapper[4740]: I1009 11:32:54.841368 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-q5tlk_f1909d9f-c6e3-4c55-93f5-be679e3c3792/manager/0.log" Oct 09 11:32:55 crc kubenswrapper[4740]: I1009 11:32:55.023561 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-gg5c4_1a6180f0-55bd-4c7e-a96c-97762cace534/kube-rbac-proxy/0.log" Oct 09 11:32:55 crc kubenswrapper[4740]: I1009 11:32:55.045534 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5647484f69-cxbqt_cd1eb7dc-bc88-4a8e-b681-751ebdf2089f/manager/0.log" Oct 09 11:32:55 crc kubenswrapper[4740]: I1009 11:32:55.081916 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-gg5c4_1a6180f0-55bd-4c7e-a96c-97762cace534/manager/0.log" Oct 09 11:32:55 crc kubenswrapper[4740]: I1009 11:32:55.109079 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-6fng6_594a8f26-8acc-44a8-b024-665012e570f6/kube-rbac-proxy/0.log" Oct 09 11:32:55 crc kubenswrapper[4740]: I1009 11:32:55.261189 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-6fng6_594a8f26-8acc-44a8-b024-665012e570f6/manager/0.log" Oct 09 11:32:55 crc kubenswrapper[4740]: I1009 11:32:55.295288 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-mxz8d_d454e0e1-1745-4fc0-aea1-9d231de7fa65/kube-rbac-proxy/0.log" Oct 09 
11:32:55 crc kubenswrapper[4740]: I1009 11:32:55.298562 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-mxz8d_d454e0e1-1745-4fc0-aea1-9d231de7fa65/manager/0.log" Oct 09 11:32:55 crc kubenswrapper[4740]: I1009 11:32:55.420215 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-ngw9n_65d851bd-9407-48da-bac4-d3b07bab1d46/kube-rbac-proxy/0.log" Oct 09 11:32:55 crc kubenswrapper[4740]: I1009 11:32:55.462331 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-ngw9n_65d851bd-9407-48da-bac4-d3b07bab1d46/manager/0.log" Oct 09 11:33:11 crc kubenswrapper[4740]: I1009 11:33:11.069427 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-mr7wc_91bc0d62-1ab0-4ca6-ad8d-ab99a4eea54b/control-plane-machine-set-operator/0.log" Oct 09 11:33:11 crc kubenswrapper[4740]: I1009 11:33:11.230054 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pxg57_2cfbd3fb-f7f5-4578-9e24-72dbd185cf12/kube-rbac-proxy/0.log" Oct 09 11:33:11 crc kubenswrapper[4740]: I1009 11:33:11.268509 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pxg57_2cfbd3fb-f7f5-4578-9e24-72dbd185cf12/machine-api-operator/0.log" Oct 09 11:33:13 crc kubenswrapper[4740]: I1009 11:33:13.812667 4740 scope.go:117] "RemoveContainer" containerID="0168a82f1393c2ad35f40ee0a56f7864d0b088d421b94bba222d28d277bb8a9e" Oct 09 11:33:23 crc kubenswrapper[4740]: I1009 11:33:23.206602 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-vsq7z_9b5b17c6-4d72-4295-bb2b-436b65625a66/cert-manager-controller/0.log" Oct 09 11:33:23 crc kubenswrapper[4740]: I1009 
11:33:23.378039 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-hlrtz_7d2a1d30-c83b-41ce-839e-3eb1f655a1c3/cert-manager-cainjector/0.log" Oct 09 11:33:23 crc kubenswrapper[4740]: I1009 11:33:23.408199 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-rdl5l_a9ffc41f-4710-469a-bae3-ae15d4eafd9b/cert-manager-webhook/0.log" Oct 09 11:33:36 crc kubenswrapper[4740]: I1009 11:33:36.433165 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-5x42v_edfeadfc-4f2b-4004-9a2d-98b6b8bbe448/nmstate-console-plugin/0.log" Oct 09 11:33:36 crc kubenswrapper[4740]: I1009 11:33:36.623198 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-f9gt2_9e19ffa9-5e2a-453c-8e50-6cb5e5c0732d/nmstate-handler/0.log" Oct 09 11:33:36 crc kubenswrapper[4740]: I1009 11:33:36.663276 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-fvr8l_eb19fbda-c268-4dce-9ef4-e2b69aaa8dfd/kube-rbac-proxy/0.log" Oct 09 11:33:36 crc kubenswrapper[4740]: I1009 11:33:36.721487 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-fvr8l_eb19fbda-c268-4dce-9ef4-e2b69aaa8dfd/nmstate-metrics/0.log" Oct 09 11:33:36 crc kubenswrapper[4740]: I1009 11:33:36.835878 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-8f6tc_a286b66d-1660-424c-b244-d889a099262c/nmstate-operator/0.log" Oct 09 11:33:36 crc kubenswrapper[4740]: I1009 11:33:36.942235 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-bslx4_1f9567a2-9e5d-4996-a625-1bcaca30d9a9/nmstate-webhook/0.log" Oct 09 11:33:51 crc kubenswrapper[4740]: I1009 11:33:51.179113 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-68d546b9d8-8dwvb_e58ad5c0-164a-4ed4-b665-44068078198c/kube-rbac-proxy/0.log" Oct 09 11:33:51 crc kubenswrapper[4740]: I1009 11:33:51.321070 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-8dwvb_e58ad5c0-164a-4ed4-b665-44068078198c/controller/0.log" Oct 09 11:33:51 crc kubenswrapper[4740]: I1009 11:33:51.422306 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-frr-files/0.log" Oct 09 11:33:51 crc kubenswrapper[4740]: I1009 11:33:51.528834 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-frr-files/0.log" Oct 09 11:33:51 crc kubenswrapper[4740]: I1009 11:33:51.557143 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-reloader/0.log" Oct 09 11:33:51 crc kubenswrapper[4740]: I1009 11:33:51.595009 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-reloader/0.log" Oct 09 11:33:51 crc kubenswrapper[4740]: I1009 11:33:51.618537 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-metrics/0.log" Oct 09 11:33:51 crc kubenswrapper[4740]: I1009 11:33:51.834689 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-frr-files/0.log" Oct 09 11:33:51 crc kubenswrapper[4740]: I1009 11:33:51.871987 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-metrics/0.log" Oct 09 11:33:51 crc kubenswrapper[4740]: I1009 11:33:51.875122 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-reloader/0.log" Oct 09 11:33:51 crc kubenswrapper[4740]: I1009 11:33:51.887230 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-metrics/0.log" Oct 09 11:33:52 crc kubenswrapper[4740]: I1009 11:33:52.066779 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-metrics/0.log" Oct 09 11:33:52 crc kubenswrapper[4740]: I1009 11:33:52.080862 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-reloader/0.log" Oct 09 11:33:52 crc kubenswrapper[4740]: I1009 11:33:52.108433 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/cp-frr-files/0.log" Oct 09 11:33:52 crc kubenswrapper[4740]: I1009 11:33:52.110875 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/controller/0.log" Oct 09 11:33:52 crc kubenswrapper[4740]: I1009 11:33:52.251903 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/frr-metrics/0.log" Oct 09 11:33:52 crc kubenswrapper[4740]: I1009 11:33:52.305206 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/kube-rbac-proxy/0.log" Oct 09 11:33:52 crc kubenswrapper[4740]: I1009 11:33:52.372504 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/kube-rbac-proxy-frr/0.log" Oct 09 11:33:52 crc kubenswrapper[4740]: I1009 11:33:52.462173 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/reloader/0.log" Oct 09 11:33:52 crc kubenswrapper[4740]: I1009 11:33:52.603450 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-76cjs_01117083-3a07-4afc-b678-11b52fd9edea/frr-k8s-webhook-server/0.log" Oct 09 11:33:52 crc kubenswrapper[4740]: I1009 11:33:52.793973 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7698d6f4f4-hrnng_97758f55-d70a-4949-b056-5673a1975dd5/manager/0.log" Oct 09 11:33:52 crc kubenswrapper[4740]: I1009 11:33:52.973144 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6749d4b858-4656g_9f363b47-14ab-4719-93dd-db269dc8f132/webhook-server/0.log" Oct 09 11:33:53 crc kubenswrapper[4740]: I1009 11:33:53.038277 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qzfv2_3bb4221a-8b49-4183-a53b-6f81deafb446/kube-rbac-proxy/0.log" Oct 09 11:33:53 crc kubenswrapper[4740]: I1009 11:33:53.664253 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qzfv2_3bb4221a-8b49-4183-a53b-6f81deafb446/speaker/0.log" Oct 09 11:33:53 crc kubenswrapper[4740]: I1009 11:33:53.746390 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-svscr_66ca384b-ba80-4854-a999-bb78c9db0e6b/frr/0.log" Oct 09 11:34:06 crc kubenswrapper[4740]: I1009 11:34:06.971956 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk_3544dc93-111d-4a49-90fc-92d76ad66184/util/0.log" Oct 09 11:34:07 crc kubenswrapper[4740]: I1009 11:34:07.088536 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk_3544dc93-111d-4a49-90fc-92d76ad66184/pull/0.log" 
Oct 09 11:34:07 crc kubenswrapper[4740]: I1009 11:34:07.095294 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk_3544dc93-111d-4a49-90fc-92d76ad66184/pull/0.log" Oct 09 11:34:07 crc kubenswrapper[4740]: I1009 11:34:07.096223 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk_3544dc93-111d-4a49-90fc-92d76ad66184/util/0.log" Oct 09 11:34:07 crc kubenswrapper[4740]: I1009 11:34:07.261622 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk_3544dc93-111d-4a49-90fc-92d76ad66184/extract/0.log" Oct 09 11:34:07 crc kubenswrapper[4740]: I1009 11:34:07.279151 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk_3544dc93-111d-4a49-90fc-92d76ad66184/pull/0.log" Oct 09 11:34:07 crc kubenswrapper[4740]: I1009 11:34:07.356057 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2s9kzk_3544dc93-111d-4a49-90fc-92d76ad66184/util/0.log" Oct 09 11:34:07 crc kubenswrapper[4740]: I1009 11:34:07.419638 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mz9pq_7e490083-3e13-4528-b489-8a7555abb9be/extract-utilities/0.log" Oct 09 11:34:07 crc kubenswrapper[4740]: I1009 11:34:07.591519 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mz9pq_7e490083-3e13-4528-b489-8a7555abb9be/extract-content/0.log" Oct 09 11:34:07 crc kubenswrapper[4740]: I1009 11:34:07.593860 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-mz9pq_7e490083-3e13-4528-b489-8a7555abb9be/extract-utilities/0.log" Oct 09 11:34:07 crc kubenswrapper[4740]: I1009 11:34:07.606021 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mz9pq_7e490083-3e13-4528-b489-8a7555abb9be/extract-content/0.log" Oct 09 11:34:07 crc kubenswrapper[4740]: I1009 11:34:07.766227 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mz9pq_7e490083-3e13-4528-b489-8a7555abb9be/extract-utilities/0.log" Oct 09 11:34:07 crc kubenswrapper[4740]: I1009 11:34:07.801302 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mz9pq_7e490083-3e13-4528-b489-8a7555abb9be/extract-content/0.log" Oct 09 11:34:07 crc kubenswrapper[4740]: I1009 11:34:07.965371 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p8hx5_eb472a61-940e-48b4-be35-bec21b4eca3c/extract-utilities/0.log" Oct 09 11:34:08 crc kubenswrapper[4740]: I1009 11:34:08.172373 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mz9pq_7e490083-3e13-4528-b489-8a7555abb9be/registry-server/0.log" Oct 09 11:34:08 crc kubenswrapper[4740]: I1009 11:34:08.177600 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p8hx5_eb472a61-940e-48b4-be35-bec21b4eca3c/extract-utilities/0.log" Oct 09 11:34:08 crc kubenswrapper[4740]: I1009 11:34:08.178420 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p8hx5_eb472a61-940e-48b4-be35-bec21b4eca3c/extract-content/0.log" Oct 09 11:34:08 crc kubenswrapper[4740]: I1009 11:34:08.240662 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-p8hx5_eb472a61-940e-48b4-be35-bec21b4eca3c/extract-content/0.log" Oct 09 11:34:08 crc kubenswrapper[4740]: I1009 11:34:08.357279 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p8hx5_eb472a61-940e-48b4-be35-bec21b4eca3c/extract-utilities/0.log" Oct 09 11:34:08 crc kubenswrapper[4740]: I1009 11:34:08.382363 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p8hx5_eb472a61-940e-48b4-be35-bec21b4eca3c/extract-content/0.log" Oct 09 11:34:08 crc kubenswrapper[4740]: I1009 11:34:08.601241 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh_5648ded7-a244-4850-ba02-14aa59ec31f1/util/0.log" Oct 09 11:34:08 crc kubenswrapper[4740]: I1009 11:34:08.808016 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh_5648ded7-a244-4850-ba02-14aa59ec31f1/pull/0.log" Oct 09 11:34:08 crc kubenswrapper[4740]: I1009 11:34:08.813837 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh_5648ded7-a244-4850-ba02-14aa59ec31f1/pull/0.log" Oct 09 11:34:08 crc kubenswrapper[4740]: I1009 11:34:08.891578 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh_5648ded7-a244-4850-ba02-14aa59ec31f1/util/0.log" Oct 09 11:34:08 crc kubenswrapper[4740]: I1009 11:34:08.936059 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p8hx5_eb472a61-940e-48b4-be35-bec21b4eca3c/registry-server/0.log" Oct 09 11:34:09 crc kubenswrapper[4740]: I1009 11:34:09.444167 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh_5648ded7-a244-4850-ba02-14aa59ec31f1/pull/0.log" Oct 09 11:34:09 crc kubenswrapper[4740]: I1009 11:34:09.468978 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh_5648ded7-a244-4850-ba02-14aa59ec31f1/extract/0.log" Oct 09 11:34:09 crc kubenswrapper[4740]: I1009 11:34:09.473920 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4fgkh_5648ded7-a244-4850-ba02-14aa59ec31f1/util/0.log" Oct 09 11:34:09 crc kubenswrapper[4740]: I1009 11:34:09.609422 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zlsxx_26400837-285d-412b-944c-5b1fcb42b34f/marketplace-operator/0.log" Oct 09 11:34:09 crc kubenswrapper[4740]: I1009 11:34:09.677620 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cnjss_64ce5306-aa76-4f28-bb81-4a37bbb283e8/extract-utilities/0.log" Oct 09 11:34:09 crc kubenswrapper[4740]: I1009 11:34:09.884514 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cnjss_64ce5306-aa76-4f28-bb81-4a37bbb283e8/extract-content/0.log" Oct 09 11:34:09 crc kubenswrapper[4740]: I1009 11:34:09.898382 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cnjss_64ce5306-aa76-4f28-bb81-4a37bbb283e8/extract-utilities/0.log" Oct 09 11:34:09 crc kubenswrapper[4740]: I1009 11:34:09.940740 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cnjss_64ce5306-aa76-4f28-bb81-4a37bbb283e8/extract-content/0.log" Oct 09 11:34:10 crc kubenswrapper[4740]: I1009 11:34:10.071699 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-cnjss_64ce5306-aa76-4f28-bb81-4a37bbb283e8/extract-content/0.log" Oct 09 11:34:10 crc kubenswrapper[4740]: I1009 11:34:10.074019 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cnjss_64ce5306-aa76-4f28-bb81-4a37bbb283e8/extract-utilities/0.log" Oct 09 11:34:10 crc kubenswrapper[4740]: I1009 11:34:10.411117 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qktxn_310ad751-9417-4fb0-bdf2-d892a167b55f/extract-utilities/0.log" Oct 09 11:34:10 crc kubenswrapper[4740]: I1009 11:34:10.468102 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cnjss_64ce5306-aa76-4f28-bb81-4a37bbb283e8/registry-server/0.log" Oct 09 11:34:10 crc kubenswrapper[4740]: I1009 11:34:10.608417 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qktxn_310ad751-9417-4fb0-bdf2-d892a167b55f/extract-utilities/0.log" Oct 09 11:34:10 crc kubenswrapper[4740]: I1009 11:34:10.613830 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qktxn_310ad751-9417-4fb0-bdf2-d892a167b55f/extract-content/0.log" Oct 09 11:34:10 crc kubenswrapper[4740]: I1009 11:34:10.616955 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qktxn_310ad751-9417-4fb0-bdf2-d892a167b55f/extract-content/0.log" Oct 09 11:34:11 crc kubenswrapper[4740]: I1009 11:34:11.147359 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xdjqv"] Oct 09 11:34:11 crc kubenswrapper[4740]: E1009 11:34:11.148220 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e8ed1fb-af00-41f3-87ed-5083c12b72e2" containerName="container-00" Oct 09 11:34:11 crc kubenswrapper[4740]: I1009 11:34:11.148244 4740 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="3e8ed1fb-af00-41f3-87ed-5083c12b72e2" containerName="container-00" Oct 09 11:34:11 crc kubenswrapper[4740]: I1009 11:34:11.148485 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e8ed1fb-af00-41f3-87ed-5083c12b72e2" containerName="container-00" Oct 09 11:34:11 crc kubenswrapper[4740]: I1009 11:34:11.150212 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xdjqv" Oct 09 11:34:11 crc kubenswrapper[4740]: I1009 11:34:11.162048 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xdjqv"] Oct 09 11:34:11 crc kubenswrapper[4740]: I1009 11:34:11.219820 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c957ea55-2931-42f0-821a-b762457fbd46-catalog-content\") pod \"certified-operators-xdjqv\" (UID: \"c957ea55-2931-42f0-821a-b762457fbd46\") " pod="openshift-marketplace/certified-operators-xdjqv" Oct 09 11:34:11 crc kubenswrapper[4740]: I1009 11:34:11.220105 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpv49\" (UniqueName: \"kubernetes.io/projected/c957ea55-2931-42f0-821a-b762457fbd46-kube-api-access-mpv49\") pod \"certified-operators-xdjqv\" (UID: \"c957ea55-2931-42f0-821a-b762457fbd46\") " pod="openshift-marketplace/certified-operators-xdjqv" Oct 09 11:34:11 crc kubenswrapper[4740]: I1009 11:34:11.220168 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c957ea55-2931-42f0-821a-b762457fbd46-utilities\") pod \"certified-operators-xdjqv\" (UID: \"c957ea55-2931-42f0-821a-b762457fbd46\") " pod="openshift-marketplace/certified-operators-xdjqv" Oct 09 11:34:11 crc kubenswrapper[4740]: I1009 11:34:11.227212 4740 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qktxn_310ad751-9417-4fb0-bdf2-d892a167b55f/extract-utilities/0.log" Oct 09 11:34:11 crc kubenswrapper[4740]: I1009 11:34:11.245253 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qktxn_310ad751-9417-4fb0-bdf2-d892a167b55f/extract-content/0.log" Oct 09 11:34:11 crc kubenswrapper[4740]: I1009 11:34:11.321542 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpv49\" (UniqueName: \"kubernetes.io/projected/c957ea55-2931-42f0-821a-b762457fbd46-kube-api-access-mpv49\") pod \"certified-operators-xdjqv\" (UID: \"c957ea55-2931-42f0-821a-b762457fbd46\") " pod="openshift-marketplace/certified-operators-xdjqv" Oct 09 11:34:11 crc kubenswrapper[4740]: I1009 11:34:11.321586 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c957ea55-2931-42f0-821a-b762457fbd46-utilities\") pod \"certified-operators-xdjqv\" (UID: \"c957ea55-2931-42f0-821a-b762457fbd46\") " pod="openshift-marketplace/certified-operators-xdjqv" Oct 09 11:34:11 crc kubenswrapper[4740]: I1009 11:34:11.321691 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c957ea55-2931-42f0-821a-b762457fbd46-catalog-content\") pod \"certified-operators-xdjqv\" (UID: \"c957ea55-2931-42f0-821a-b762457fbd46\") " pod="openshift-marketplace/certified-operators-xdjqv" Oct 09 11:34:11 crc kubenswrapper[4740]: I1009 11:34:11.322149 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c957ea55-2931-42f0-821a-b762457fbd46-catalog-content\") pod \"certified-operators-xdjqv\" (UID: \"c957ea55-2931-42f0-821a-b762457fbd46\") " pod="openshift-marketplace/certified-operators-xdjqv" Oct 09 11:34:11 crc kubenswrapper[4740]: 
I1009 11:34:11.322209 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c957ea55-2931-42f0-821a-b762457fbd46-utilities\") pod \"certified-operators-xdjqv\" (UID: \"c957ea55-2931-42f0-821a-b762457fbd46\") " pod="openshift-marketplace/certified-operators-xdjqv" Oct 09 11:34:11 crc kubenswrapper[4740]: I1009 11:34:11.352577 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpv49\" (UniqueName: \"kubernetes.io/projected/c957ea55-2931-42f0-821a-b762457fbd46-kube-api-access-mpv49\") pod \"certified-operators-xdjqv\" (UID: \"c957ea55-2931-42f0-821a-b762457fbd46\") " pod="openshift-marketplace/certified-operators-xdjqv" Oct 09 11:34:11 crc kubenswrapper[4740]: I1009 11:34:11.473947 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xdjqv" Oct 09 11:34:12 crc kubenswrapper[4740]: I1009 11:34:12.035216 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qktxn_310ad751-9417-4fb0-bdf2-d892a167b55f/registry-server/0.log" Oct 09 11:34:12 crc kubenswrapper[4740]: I1009 11:34:12.059612 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xdjqv"] Oct 09 11:34:12 crc kubenswrapper[4740]: I1009 11:34:12.298130 4740 generic.go:334] "Generic (PLEG): container finished" podID="c957ea55-2931-42f0-821a-b762457fbd46" containerID="a7775868c9e2c18abe3fe5061d1e7cb24a8b558dc23f745d81eb1850b23a6f98" exitCode=0 Oct 09 11:34:12 crc kubenswrapper[4740]: I1009 11:34:12.298245 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdjqv" event={"ID":"c957ea55-2931-42f0-821a-b762457fbd46","Type":"ContainerDied","Data":"a7775868c9e2c18abe3fe5061d1e7cb24a8b558dc23f745d81eb1850b23a6f98"} Oct 09 11:34:12 crc kubenswrapper[4740]: I1009 11:34:12.298455 4740 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdjqv" event={"ID":"c957ea55-2931-42f0-821a-b762457fbd46","Type":"ContainerStarted","Data":"b19fd1d1549a1f6b4c13e7dc860fbb5e6b123d2ea0c7235d138b19bdeef97d8d"} Oct 09 11:34:14 crc kubenswrapper[4740]: I1009 11:34:14.327174 4740 generic.go:334] "Generic (PLEG): container finished" podID="c957ea55-2931-42f0-821a-b762457fbd46" containerID="d1081e3a37905ab16869ad55b742b1b1df8b9e962575935b60b0b294d5ba7f74" exitCode=0 Oct 09 11:34:14 crc kubenswrapper[4740]: I1009 11:34:14.327233 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdjqv" event={"ID":"c957ea55-2931-42f0-821a-b762457fbd46","Type":"ContainerDied","Data":"d1081e3a37905ab16869ad55b742b1b1df8b9e962575935b60b0b294d5ba7f74"} Oct 09 11:34:15 crc kubenswrapper[4740]: I1009 11:34:15.337532 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdjqv" event={"ID":"c957ea55-2931-42f0-821a-b762457fbd46","Type":"ContainerStarted","Data":"62145c629074f561c3d1d69966022583c204baea89389b653126b57c9f2aa3f3"} Oct 09 11:34:15 crc kubenswrapper[4740]: I1009 11:34:15.356289 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xdjqv" podStartSLOduration=1.914443677 podStartE2EDuration="4.356268746s" podCreationTimestamp="2025-10-09 11:34:11 +0000 UTC" firstStartedPulling="2025-10-09 11:34:12.300243545 +0000 UTC m=+3991.262443936" lastFinishedPulling="2025-10-09 11:34:14.742068624 +0000 UTC m=+3993.704269005" observedRunningTime="2025-10-09 11:34:15.353180533 +0000 UTC m=+3994.315380914" watchObservedRunningTime="2025-10-09 11:34:15.356268746 +0000 UTC m=+3994.318469127" Oct 09 11:34:21 crc kubenswrapper[4740]: I1009 11:34:21.474200 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xdjqv" Oct 09 11:34:21 crc 
kubenswrapper[4740]: I1009 11:34:21.474920 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xdjqv" Oct 09 11:34:21 crc kubenswrapper[4740]: I1009 11:34:21.536737 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xdjqv" Oct 09 11:34:22 crc kubenswrapper[4740]: I1009 11:34:22.448712 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xdjqv" Oct 09 11:34:22 crc kubenswrapper[4740]: I1009 11:34:22.491360 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xdjqv"] Oct 09 11:34:24 crc kubenswrapper[4740]: I1009 11:34:24.418480 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xdjqv" podUID="c957ea55-2931-42f0-821a-b762457fbd46" containerName="registry-server" containerID="cri-o://62145c629074f561c3d1d69966022583c204baea89389b653126b57c9f2aa3f3" gracePeriod=2 Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.086691 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xdjqv" Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.109303 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpv49\" (UniqueName: \"kubernetes.io/projected/c957ea55-2931-42f0-821a-b762457fbd46-kube-api-access-mpv49\") pod \"c957ea55-2931-42f0-821a-b762457fbd46\" (UID: \"c957ea55-2931-42f0-821a-b762457fbd46\") " Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.109368 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c957ea55-2931-42f0-821a-b762457fbd46-catalog-content\") pod \"c957ea55-2931-42f0-821a-b762457fbd46\" (UID: \"c957ea55-2931-42f0-821a-b762457fbd46\") " Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.109387 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c957ea55-2931-42f0-821a-b762457fbd46-utilities\") pod \"c957ea55-2931-42f0-821a-b762457fbd46\" (UID: \"c957ea55-2931-42f0-821a-b762457fbd46\") " Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.110908 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c957ea55-2931-42f0-821a-b762457fbd46-utilities" (OuterVolumeSpecName: "utilities") pod "c957ea55-2931-42f0-821a-b762457fbd46" (UID: "c957ea55-2931-42f0-821a-b762457fbd46"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.153033 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c957ea55-2931-42f0-821a-b762457fbd46-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c957ea55-2931-42f0-821a-b762457fbd46" (UID: "c957ea55-2931-42f0-821a-b762457fbd46"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.159661 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c957ea55-2931-42f0-821a-b762457fbd46-kube-api-access-mpv49" (OuterVolumeSpecName: "kube-api-access-mpv49") pod "c957ea55-2931-42f0-821a-b762457fbd46" (UID: "c957ea55-2931-42f0-821a-b762457fbd46"). InnerVolumeSpecName "kube-api-access-mpv49". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.211579 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpv49\" (UniqueName: \"kubernetes.io/projected/c957ea55-2931-42f0-821a-b762457fbd46-kube-api-access-mpv49\") on node \"crc\" DevicePath \"\"" Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.211630 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c957ea55-2931-42f0-821a-b762457fbd46-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.211645 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c957ea55-2931-42f0-821a-b762457fbd46-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.433630 4740 generic.go:334] "Generic (PLEG): container finished" podID="c957ea55-2931-42f0-821a-b762457fbd46" containerID="62145c629074f561c3d1d69966022583c204baea89389b653126b57c9f2aa3f3" exitCode=0 Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.433694 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdjqv" event={"ID":"c957ea55-2931-42f0-821a-b762457fbd46","Type":"ContainerDied","Data":"62145c629074f561c3d1d69966022583c204baea89389b653126b57c9f2aa3f3"} Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.433715 4740 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xdjqv" Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.433746 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdjqv" event={"ID":"c957ea55-2931-42f0-821a-b762457fbd46","Type":"ContainerDied","Data":"b19fd1d1549a1f6b4c13e7dc860fbb5e6b123d2ea0c7235d138b19bdeef97d8d"} Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.433809 4740 scope.go:117] "RemoveContainer" containerID="62145c629074f561c3d1d69966022583c204baea89389b653126b57c9f2aa3f3" Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.461923 4740 scope.go:117] "RemoveContainer" containerID="d1081e3a37905ab16869ad55b742b1b1df8b9e962575935b60b0b294d5ba7f74" Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.472256 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xdjqv"] Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.485268 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xdjqv"] Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.491115 4740 scope.go:117] "RemoveContainer" containerID="a7775868c9e2c18abe3fe5061d1e7cb24a8b558dc23f745d81eb1850b23a6f98" Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.566956 4740 scope.go:117] "RemoveContainer" containerID="62145c629074f561c3d1d69966022583c204baea89389b653126b57c9f2aa3f3" Oct 09 11:34:25 crc kubenswrapper[4740]: E1009 11:34:25.567520 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62145c629074f561c3d1d69966022583c204baea89389b653126b57c9f2aa3f3\": container with ID starting with 62145c629074f561c3d1d69966022583c204baea89389b653126b57c9f2aa3f3 not found: ID does not exist" containerID="62145c629074f561c3d1d69966022583c204baea89389b653126b57c9f2aa3f3" Oct 09 11:34:25 crc kubenswrapper[4740]: 
I1009 11:34:25.567553 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62145c629074f561c3d1d69966022583c204baea89389b653126b57c9f2aa3f3"} err="failed to get container status \"62145c629074f561c3d1d69966022583c204baea89389b653126b57c9f2aa3f3\": rpc error: code = NotFound desc = could not find container \"62145c629074f561c3d1d69966022583c204baea89389b653126b57c9f2aa3f3\": container with ID starting with 62145c629074f561c3d1d69966022583c204baea89389b653126b57c9f2aa3f3 not found: ID does not exist" Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.567573 4740 scope.go:117] "RemoveContainer" containerID="d1081e3a37905ab16869ad55b742b1b1df8b9e962575935b60b0b294d5ba7f74" Oct 09 11:34:25 crc kubenswrapper[4740]: E1009 11:34:25.569413 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1081e3a37905ab16869ad55b742b1b1df8b9e962575935b60b0b294d5ba7f74\": container with ID starting with d1081e3a37905ab16869ad55b742b1b1df8b9e962575935b60b0b294d5ba7f74 not found: ID does not exist" containerID="d1081e3a37905ab16869ad55b742b1b1df8b9e962575935b60b0b294d5ba7f74" Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.569459 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1081e3a37905ab16869ad55b742b1b1df8b9e962575935b60b0b294d5ba7f74"} err="failed to get container status \"d1081e3a37905ab16869ad55b742b1b1df8b9e962575935b60b0b294d5ba7f74\": rpc error: code = NotFound desc = could not find container \"d1081e3a37905ab16869ad55b742b1b1df8b9e962575935b60b0b294d5ba7f74\": container with ID starting with d1081e3a37905ab16869ad55b742b1b1df8b9e962575935b60b0b294d5ba7f74 not found: ID does not exist" Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.569487 4740 scope.go:117] "RemoveContainer" containerID="a7775868c9e2c18abe3fe5061d1e7cb24a8b558dc23f745d81eb1850b23a6f98" Oct 09 11:34:25 crc 
kubenswrapper[4740]: E1009 11:34:25.569990 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7775868c9e2c18abe3fe5061d1e7cb24a8b558dc23f745d81eb1850b23a6f98\": container with ID starting with a7775868c9e2c18abe3fe5061d1e7cb24a8b558dc23f745d81eb1850b23a6f98 not found: ID does not exist" containerID="a7775868c9e2c18abe3fe5061d1e7cb24a8b558dc23f745d81eb1850b23a6f98" Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.570021 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7775868c9e2c18abe3fe5061d1e7cb24a8b558dc23f745d81eb1850b23a6f98"} err="failed to get container status \"a7775868c9e2c18abe3fe5061d1e7cb24a8b558dc23f745d81eb1850b23a6f98\": rpc error: code = NotFound desc = could not find container \"a7775868c9e2c18abe3fe5061d1e7cb24a8b558dc23f745d81eb1850b23a6f98\": container with ID starting with a7775868c9e2c18abe3fe5061d1e7cb24a8b558dc23f745d81eb1850b23a6f98 not found: ID does not exist" Oct 09 11:34:25 crc kubenswrapper[4740]: I1009 11:34:25.764650 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c957ea55-2931-42f0-821a-b762457fbd46" path="/var/lib/kubelet/pods/c957ea55-2931-42f0-821a-b762457fbd46/volumes" Oct 09 11:34:36 crc kubenswrapper[4740]: I1009 11:34:36.225946 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x7h7z"] Oct 09 11:34:36 crc kubenswrapper[4740]: E1009 11:34:36.226773 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c957ea55-2931-42f0-821a-b762457fbd46" containerName="registry-server" Oct 09 11:34:36 crc kubenswrapper[4740]: I1009 11:34:36.226788 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c957ea55-2931-42f0-821a-b762457fbd46" containerName="registry-server" Oct 09 11:34:36 crc kubenswrapper[4740]: E1009 11:34:36.226837 4740 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c957ea55-2931-42f0-821a-b762457fbd46" containerName="extract-utilities" Oct 09 11:34:36 crc kubenswrapper[4740]: I1009 11:34:36.226849 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c957ea55-2931-42f0-821a-b762457fbd46" containerName="extract-utilities" Oct 09 11:34:36 crc kubenswrapper[4740]: E1009 11:34:36.226874 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c957ea55-2931-42f0-821a-b762457fbd46" containerName="extract-content" Oct 09 11:34:36 crc kubenswrapper[4740]: I1009 11:34:36.226881 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c957ea55-2931-42f0-821a-b762457fbd46" containerName="extract-content" Oct 09 11:34:36 crc kubenswrapper[4740]: I1009 11:34:36.227077 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="c957ea55-2931-42f0-821a-b762457fbd46" containerName="registry-server" Oct 09 11:34:36 crc kubenswrapper[4740]: I1009 11:34:36.228361 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x7h7z" Oct 09 11:34:36 crc kubenswrapper[4740]: I1009 11:34:36.256364 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x7h7z"] Oct 09 11:34:36 crc kubenswrapper[4740]: I1009 11:34:36.316015 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/535549fd-3f63-46b8-bb2b-8acec62d5c83-catalog-content\") pod \"community-operators-x7h7z\" (UID: \"535549fd-3f63-46b8-bb2b-8acec62d5c83\") " pod="openshift-marketplace/community-operators-x7h7z" Oct 09 11:34:36 crc kubenswrapper[4740]: I1009 11:34:36.316173 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/535549fd-3f63-46b8-bb2b-8acec62d5c83-utilities\") pod \"community-operators-x7h7z\" (UID: 
\"535549fd-3f63-46b8-bb2b-8acec62d5c83\") " pod="openshift-marketplace/community-operators-x7h7z" Oct 09 11:34:36 crc kubenswrapper[4740]: I1009 11:34:36.316199 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md7fp\" (UniqueName: \"kubernetes.io/projected/535549fd-3f63-46b8-bb2b-8acec62d5c83-kube-api-access-md7fp\") pod \"community-operators-x7h7z\" (UID: \"535549fd-3f63-46b8-bb2b-8acec62d5c83\") " pod="openshift-marketplace/community-operators-x7h7z" Oct 09 11:34:36 crc kubenswrapper[4740]: I1009 11:34:36.417554 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/535549fd-3f63-46b8-bb2b-8acec62d5c83-utilities\") pod \"community-operators-x7h7z\" (UID: \"535549fd-3f63-46b8-bb2b-8acec62d5c83\") " pod="openshift-marketplace/community-operators-x7h7z" Oct 09 11:34:36 crc kubenswrapper[4740]: I1009 11:34:36.417601 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md7fp\" (UniqueName: \"kubernetes.io/projected/535549fd-3f63-46b8-bb2b-8acec62d5c83-kube-api-access-md7fp\") pod \"community-operators-x7h7z\" (UID: \"535549fd-3f63-46b8-bb2b-8acec62d5c83\") " pod="openshift-marketplace/community-operators-x7h7z" Oct 09 11:34:36 crc kubenswrapper[4740]: I1009 11:34:36.417653 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/535549fd-3f63-46b8-bb2b-8acec62d5c83-catalog-content\") pod \"community-operators-x7h7z\" (UID: \"535549fd-3f63-46b8-bb2b-8acec62d5c83\") " pod="openshift-marketplace/community-operators-x7h7z" Oct 09 11:34:36 crc kubenswrapper[4740]: I1009 11:34:36.418113 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/535549fd-3f63-46b8-bb2b-8acec62d5c83-catalog-content\") pod 
\"community-operators-x7h7z\" (UID: \"535549fd-3f63-46b8-bb2b-8acec62d5c83\") " pod="openshift-marketplace/community-operators-x7h7z" Oct 09 11:34:36 crc kubenswrapper[4740]: I1009 11:34:36.418115 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/535549fd-3f63-46b8-bb2b-8acec62d5c83-utilities\") pod \"community-operators-x7h7z\" (UID: \"535549fd-3f63-46b8-bb2b-8acec62d5c83\") " pod="openshift-marketplace/community-operators-x7h7z" Oct 09 11:34:36 crc kubenswrapper[4740]: I1009 11:34:36.435900 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md7fp\" (UniqueName: \"kubernetes.io/projected/535549fd-3f63-46b8-bb2b-8acec62d5c83-kube-api-access-md7fp\") pod \"community-operators-x7h7z\" (UID: \"535549fd-3f63-46b8-bb2b-8acec62d5c83\") " pod="openshift-marketplace/community-operators-x7h7z" Oct 09 11:34:36 crc kubenswrapper[4740]: I1009 11:34:36.577897 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x7h7z" Oct 09 11:34:37 crc kubenswrapper[4740]: I1009 11:34:37.181994 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x7h7z"] Oct 09 11:34:37 crc kubenswrapper[4740]: I1009 11:34:37.539737 4740 generic.go:334] "Generic (PLEG): container finished" podID="535549fd-3f63-46b8-bb2b-8acec62d5c83" containerID="e97595c94a189db45444deec22e523303e2e51b58561fac7a5e49a12342f43bc" exitCode=0 Oct 09 11:34:37 crc kubenswrapper[4740]: I1009 11:34:37.539811 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7h7z" event={"ID":"535549fd-3f63-46b8-bb2b-8acec62d5c83","Type":"ContainerDied","Data":"e97595c94a189db45444deec22e523303e2e51b58561fac7a5e49a12342f43bc"} Oct 09 11:34:37 crc kubenswrapper[4740]: I1009 11:34:37.539860 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7h7z" event={"ID":"535549fd-3f63-46b8-bb2b-8acec62d5c83","Type":"ContainerStarted","Data":"e0d5e9772d8963e4574157d052e2f00fb496bba7bb1b25bcb3f208885d2e1420"} Oct 09 11:34:38 crc kubenswrapper[4740]: I1009 11:34:38.549380 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7h7z" event={"ID":"535549fd-3f63-46b8-bb2b-8acec62d5c83","Type":"ContainerStarted","Data":"cf64c9199f444b09c04b9976cfd0111d6670364e40607f49371c7ae71d3a75db"} Oct 09 11:34:39 crc kubenswrapper[4740]: I1009 11:34:39.561102 4740 generic.go:334] "Generic (PLEG): container finished" podID="535549fd-3f63-46b8-bb2b-8acec62d5c83" containerID="cf64c9199f444b09c04b9976cfd0111d6670364e40607f49371c7ae71d3a75db" exitCode=0 Oct 09 11:34:39 crc kubenswrapper[4740]: I1009 11:34:39.561148 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7h7z" 
event={"ID":"535549fd-3f63-46b8-bb2b-8acec62d5c83","Type":"ContainerDied","Data":"cf64c9199f444b09c04b9976cfd0111d6670364e40607f49371c7ae71d3a75db"} Oct 09 11:34:41 crc kubenswrapper[4740]: I1009 11:34:41.582333 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7h7z" event={"ID":"535549fd-3f63-46b8-bb2b-8acec62d5c83","Type":"ContainerStarted","Data":"6e95fb33dd12eba5ca0d69f09937935576445728e8330489c65ea0f96a352899"} Oct 09 11:34:41 crc kubenswrapper[4740]: I1009 11:34:41.612400 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x7h7z" podStartSLOduration=2.458023081 podStartE2EDuration="5.612384472s" podCreationTimestamp="2025-10-09 11:34:36 +0000 UTC" firstStartedPulling="2025-10-09 11:34:37.541685659 +0000 UTC m=+4016.503886040" lastFinishedPulling="2025-10-09 11:34:40.69604706 +0000 UTC m=+4019.658247431" observedRunningTime="2025-10-09 11:34:41.607480335 +0000 UTC m=+4020.569680746" watchObservedRunningTime="2025-10-09 11:34:41.612384472 +0000 UTC m=+4020.574584853" Oct 09 11:34:46 crc kubenswrapper[4740]: I1009 11:34:46.579062 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x7h7z" Oct 09 11:34:46 crc kubenswrapper[4740]: I1009 11:34:46.580006 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x7h7z" Oct 09 11:34:46 crc kubenswrapper[4740]: I1009 11:34:46.643983 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x7h7z" Oct 09 11:34:46 crc kubenswrapper[4740]: I1009 11:34:46.694114 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x7h7z" Oct 09 11:34:46 crc kubenswrapper[4740]: I1009 11:34:46.893188 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-x7h7z"] Oct 09 11:34:48 crc kubenswrapper[4740]: I1009 11:34:48.645112 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x7h7z" podUID="535549fd-3f63-46b8-bb2b-8acec62d5c83" containerName="registry-server" containerID="cri-o://6e95fb33dd12eba5ca0d69f09937935576445728e8330489c65ea0f96a352899" gracePeriod=2 Oct 09 11:34:49 crc kubenswrapper[4740]: I1009 11:34:49.207550 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x7h7z" Oct 09 11:34:49 crc kubenswrapper[4740]: I1009 11:34:49.263442 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/535549fd-3f63-46b8-bb2b-8acec62d5c83-utilities\") pod \"535549fd-3f63-46b8-bb2b-8acec62d5c83\" (UID: \"535549fd-3f63-46b8-bb2b-8acec62d5c83\") " Oct 09 11:34:49 crc kubenswrapper[4740]: I1009 11:34:49.264102 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md7fp\" (UniqueName: \"kubernetes.io/projected/535549fd-3f63-46b8-bb2b-8acec62d5c83-kube-api-access-md7fp\") pod \"535549fd-3f63-46b8-bb2b-8acec62d5c83\" (UID: \"535549fd-3f63-46b8-bb2b-8acec62d5c83\") " Oct 09 11:34:49 crc kubenswrapper[4740]: I1009 11:34:49.264239 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/535549fd-3f63-46b8-bb2b-8acec62d5c83-catalog-content\") pod \"535549fd-3f63-46b8-bb2b-8acec62d5c83\" (UID: \"535549fd-3f63-46b8-bb2b-8acec62d5c83\") " Oct 09 11:34:49 crc kubenswrapper[4740]: I1009 11:34:49.264375 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/535549fd-3f63-46b8-bb2b-8acec62d5c83-utilities" (OuterVolumeSpecName: "utilities") pod "535549fd-3f63-46b8-bb2b-8acec62d5c83" (UID: 
"535549fd-3f63-46b8-bb2b-8acec62d5c83"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:34:49 crc kubenswrapper[4740]: I1009 11:34:49.264894 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/535549fd-3f63-46b8-bb2b-8acec62d5c83-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 11:34:49 crc kubenswrapper[4740]: I1009 11:34:49.272739 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/535549fd-3f63-46b8-bb2b-8acec62d5c83-kube-api-access-md7fp" (OuterVolumeSpecName: "kube-api-access-md7fp") pod "535549fd-3f63-46b8-bb2b-8acec62d5c83" (UID: "535549fd-3f63-46b8-bb2b-8acec62d5c83"). InnerVolumeSpecName "kube-api-access-md7fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:34:49 crc kubenswrapper[4740]: I1009 11:34:49.322778 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/535549fd-3f63-46b8-bb2b-8acec62d5c83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "535549fd-3f63-46b8-bb2b-8acec62d5c83" (UID: "535549fd-3f63-46b8-bb2b-8acec62d5c83"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:34:49 crc kubenswrapper[4740]: I1009 11:34:49.366270 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md7fp\" (UniqueName: \"kubernetes.io/projected/535549fd-3f63-46b8-bb2b-8acec62d5c83-kube-api-access-md7fp\") on node \"crc\" DevicePath \"\"" Oct 09 11:34:49 crc kubenswrapper[4740]: I1009 11:34:49.366309 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/535549fd-3f63-46b8-bb2b-8acec62d5c83-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 11:34:49 crc kubenswrapper[4740]: I1009 11:34:49.658595 4740 generic.go:334] "Generic (PLEG): container finished" podID="535549fd-3f63-46b8-bb2b-8acec62d5c83" containerID="6e95fb33dd12eba5ca0d69f09937935576445728e8330489c65ea0f96a352899" exitCode=0 Oct 09 11:34:49 crc kubenswrapper[4740]: I1009 11:34:49.658659 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x7h7z" Oct 09 11:34:49 crc kubenswrapper[4740]: I1009 11:34:49.658675 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7h7z" event={"ID":"535549fd-3f63-46b8-bb2b-8acec62d5c83","Type":"ContainerDied","Data":"6e95fb33dd12eba5ca0d69f09937935576445728e8330489c65ea0f96a352899"} Oct 09 11:34:49 crc kubenswrapper[4740]: I1009 11:34:49.658737 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7h7z" event={"ID":"535549fd-3f63-46b8-bb2b-8acec62d5c83","Type":"ContainerDied","Data":"e0d5e9772d8963e4574157d052e2f00fb496bba7bb1b25bcb3f208885d2e1420"} Oct 09 11:34:49 crc kubenswrapper[4740]: I1009 11:34:49.658791 4740 scope.go:117] "RemoveContainer" containerID="6e95fb33dd12eba5ca0d69f09937935576445728e8330489c65ea0f96a352899" Oct 09 11:34:49 crc kubenswrapper[4740]: I1009 11:34:49.722877 4740 scope.go:117] "RemoveContainer" 
containerID="cf64c9199f444b09c04b9976cfd0111d6670364e40607f49371c7ae71d3a75db" Oct 09 11:34:49 crc kubenswrapper[4740]: I1009 11:34:49.729273 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x7h7z"] Oct 09 11:34:49 crc kubenswrapper[4740]: I1009 11:34:49.741251 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x7h7z"] Oct 09 11:34:49 crc kubenswrapper[4740]: I1009 11:34:49.782080 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="535549fd-3f63-46b8-bb2b-8acec62d5c83" path="/var/lib/kubelet/pods/535549fd-3f63-46b8-bb2b-8acec62d5c83/volumes" Oct 09 11:34:50 crc kubenswrapper[4740]: I1009 11:34:50.240916 4740 scope.go:117] "RemoveContainer" containerID="e97595c94a189db45444deec22e523303e2e51b58561fac7a5e49a12342f43bc" Oct 09 11:34:50 crc kubenswrapper[4740]: I1009 11:34:50.285057 4740 scope.go:117] "RemoveContainer" containerID="6e95fb33dd12eba5ca0d69f09937935576445728e8330489c65ea0f96a352899" Oct 09 11:34:50 crc kubenswrapper[4740]: E1009 11:34:50.285603 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e95fb33dd12eba5ca0d69f09937935576445728e8330489c65ea0f96a352899\": container with ID starting with 6e95fb33dd12eba5ca0d69f09937935576445728e8330489c65ea0f96a352899 not found: ID does not exist" containerID="6e95fb33dd12eba5ca0d69f09937935576445728e8330489c65ea0f96a352899" Oct 09 11:34:50 crc kubenswrapper[4740]: I1009 11:34:50.285645 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e95fb33dd12eba5ca0d69f09937935576445728e8330489c65ea0f96a352899"} err="failed to get container status \"6e95fb33dd12eba5ca0d69f09937935576445728e8330489c65ea0f96a352899\": rpc error: code = NotFound desc = could not find container \"6e95fb33dd12eba5ca0d69f09937935576445728e8330489c65ea0f96a352899\": container with ID starting with 
6e95fb33dd12eba5ca0d69f09937935576445728e8330489c65ea0f96a352899 not found: ID does not exist" Oct 09 11:34:50 crc kubenswrapper[4740]: I1009 11:34:50.285671 4740 scope.go:117] "RemoveContainer" containerID="cf64c9199f444b09c04b9976cfd0111d6670364e40607f49371c7ae71d3a75db" Oct 09 11:34:50 crc kubenswrapper[4740]: E1009 11:34:50.286190 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf64c9199f444b09c04b9976cfd0111d6670364e40607f49371c7ae71d3a75db\": container with ID starting with cf64c9199f444b09c04b9976cfd0111d6670364e40607f49371c7ae71d3a75db not found: ID does not exist" containerID="cf64c9199f444b09c04b9976cfd0111d6670364e40607f49371c7ae71d3a75db" Oct 09 11:34:50 crc kubenswrapper[4740]: I1009 11:34:50.286273 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf64c9199f444b09c04b9976cfd0111d6670364e40607f49371c7ae71d3a75db"} err="failed to get container status \"cf64c9199f444b09c04b9976cfd0111d6670364e40607f49371c7ae71d3a75db\": rpc error: code = NotFound desc = could not find container \"cf64c9199f444b09c04b9976cfd0111d6670364e40607f49371c7ae71d3a75db\": container with ID starting with cf64c9199f444b09c04b9976cfd0111d6670364e40607f49371c7ae71d3a75db not found: ID does not exist" Oct 09 11:34:50 crc kubenswrapper[4740]: I1009 11:34:50.286299 4740 scope.go:117] "RemoveContainer" containerID="e97595c94a189db45444deec22e523303e2e51b58561fac7a5e49a12342f43bc" Oct 09 11:34:50 crc kubenswrapper[4740]: E1009 11:34:50.286698 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e97595c94a189db45444deec22e523303e2e51b58561fac7a5e49a12342f43bc\": container with ID starting with e97595c94a189db45444deec22e523303e2e51b58561fac7a5e49a12342f43bc not found: ID does not exist" containerID="e97595c94a189db45444deec22e523303e2e51b58561fac7a5e49a12342f43bc" Oct 09 11:34:50 crc 
kubenswrapper[4740]: I1009 11:34:50.286730 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97595c94a189db45444deec22e523303e2e51b58561fac7a5e49a12342f43bc"} err="failed to get container status \"e97595c94a189db45444deec22e523303e2e51b58561fac7a5e49a12342f43bc\": rpc error: code = NotFound desc = could not find container \"e97595c94a189db45444deec22e523303e2e51b58561fac7a5e49a12342f43bc\": container with ID starting with e97595c94a189db45444deec22e523303e2e51b58561fac7a5e49a12342f43bc not found: ID does not exist" Oct 09 11:35:05 crc kubenswrapper[4740]: I1009 11:35:05.407227 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 11:35:05 crc kubenswrapper[4740]: I1009 11:35:05.408078 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 11:35:35 crc kubenswrapper[4740]: I1009 11:35:35.408000 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 11:35:35 crc kubenswrapper[4740]: I1009 11:35:35.409112 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 11:35:46 crc kubenswrapper[4740]: I1009 11:35:46.302258 4740 generic.go:334] "Generic (PLEG): container finished" podID="ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2" containerID="0568a33f5e7da95fecc806738bf271a04b228f33cea9525883cae55298d188b3" exitCode=0 Oct 09 11:35:46 crc kubenswrapper[4740]: I1009 11:35:46.302485 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6swms/must-gather-bqb9l" event={"ID":"ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2","Type":"ContainerDied","Data":"0568a33f5e7da95fecc806738bf271a04b228f33cea9525883cae55298d188b3"} Oct 09 11:35:46 crc kubenswrapper[4740]: I1009 11:35:46.304340 4740 scope.go:117] "RemoveContainer" containerID="0568a33f5e7da95fecc806738bf271a04b228f33cea9525883cae55298d188b3" Oct 09 11:35:46 crc kubenswrapper[4740]: I1009 11:35:46.491734 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6swms_must-gather-bqb9l_ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2/gather/0.log" Oct 09 11:35:47 crc kubenswrapper[4740]: I1009 11:35:47.565927 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kmvcb"] Oct 09 11:35:47 crc kubenswrapper[4740]: E1009 11:35:47.566653 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="535549fd-3f63-46b8-bb2b-8acec62d5c83" containerName="extract-content" Oct 09 11:35:47 crc kubenswrapper[4740]: I1009 11:35:47.566668 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="535549fd-3f63-46b8-bb2b-8acec62d5c83" containerName="extract-content" Oct 09 11:35:47 crc kubenswrapper[4740]: E1009 11:35:47.566703 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="535549fd-3f63-46b8-bb2b-8acec62d5c83" containerName="extract-utilities" Oct 09 11:35:47 crc kubenswrapper[4740]: I1009 11:35:47.566711 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="535549fd-3f63-46b8-bb2b-8acec62d5c83" 
containerName="extract-utilities" Oct 09 11:35:47 crc kubenswrapper[4740]: E1009 11:35:47.566726 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="535549fd-3f63-46b8-bb2b-8acec62d5c83" containerName="registry-server" Oct 09 11:35:47 crc kubenswrapper[4740]: I1009 11:35:47.566735 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="535549fd-3f63-46b8-bb2b-8acec62d5c83" containerName="registry-server" Oct 09 11:35:47 crc kubenswrapper[4740]: I1009 11:35:47.566991 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="535549fd-3f63-46b8-bb2b-8acec62d5c83" containerName="registry-server" Oct 09 11:35:47 crc kubenswrapper[4740]: I1009 11:35:47.568577 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kmvcb" Oct 09 11:35:47 crc kubenswrapper[4740]: I1009 11:35:47.577575 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kmvcb"] Oct 09 11:35:47 crc kubenswrapper[4740]: I1009 11:35:47.702778 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c56bc8a-e994-4465-9234-38aabe8411fa-catalog-content\") pod \"redhat-marketplace-kmvcb\" (UID: \"4c56bc8a-e994-4465-9234-38aabe8411fa\") " pod="openshift-marketplace/redhat-marketplace-kmvcb" Oct 09 11:35:47 crc kubenswrapper[4740]: I1009 11:35:47.702934 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c56bc8a-e994-4465-9234-38aabe8411fa-utilities\") pod \"redhat-marketplace-kmvcb\" (UID: \"4c56bc8a-e994-4465-9234-38aabe8411fa\") " pod="openshift-marketplace/redhat-marketplace-kmvcb" Oct 09 11:35:47 crc kubenswrapper[4740]: I1009 11:35:47.703081 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qxfrp\" (UniqueName: \"kubernetes.io/projected/4c56bc8a-e994-4465-9234-38aabe8411fa-kube-api-access-qxfrp\") pod \"redhat-marketplace-kmvcb\" (UID: \"4c56bc8a-e994-4465-9234-38aabe8411fa\") " pod="openshift-marketplace/redhat-marketplace-kmvcb" Oct 09 11:35:47 crc kubenswrapper[4740]: I1009 11:35:47.804636 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c56bc8a-e994-4465-9234-38aabe8411fa-catalog-content\") pod \"redhat-marketplace-kmvcb\" (UID: \"4c56bc8a-e994-4465-9234-38aabe8411fa\") " pod="openshift-marketplace/redhat-marketplace-kmvcb" Oct 09 11:35:47 crc kubenswrapper[4740]: I1009 11:35:47.804719 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c56bc8a-e994-4465-9234-38aabe8411fa-utilities\") pod \"redhat-marketplace-kmvcb\" (UID: \"4c56bc8a-e994-4465-9234-38aabe8411fa\") " pod="openshift-marketplace/redhat-marketplace-kmvcb" Oct 09 11:35:47 crc kubenswrapper[4740]: I1009 11:35:47.804765 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxfrp\" (UniqueName: \"kubernetes.io/projected/4c56bc8a-e994-4465-9234-38aabe8411fa-kube-api-access-qxfrp\") pod \"redhat-marketplace-kmvcb\" (UID: \"4c56bc8a-e994-4465-9234-38aabe8411fa\") " pod="openshift-marketplace/redhat-marketplace-kmvcb" Oct 09 11:35:47 crc kubenswrapper[4740]: I1009 11:35:47.805356 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c56bc8a-e994-4465-9234-38aabe8411fa-catalog-content\") pod \"redhat-marketplace-kmvcb\" (UID: \"4c56bc8a-e994-4465-9234-38aabe8411fa\") " pod="openshift-marketplace/redhat-marketplace-kmvcb" Oct 09 11:35:47 crc kubenswrapper[4740]: I1009 11:35:47.805570 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4c56bc8a-e994-4465-9234-38aabe8411fa-utilities\") pod \"redhat-marketplace-kmvcb\" (UID: \"4c56bc8a-e994-4465-9234-38aabe8411fa\") " pod="openshift-marketplace/redhat-marketplace-kmvcb" Oct 09 11:35:47 crc kubenswrapper[4740]: I1009 11:35:47.825496 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxfrp\" (UniqueName: \"kubernetes.io/projected/4c56bc8a-e994-4465-9234-38aabe8411fa-kube-api-access-qxfrp\") pod \"redhat-marketplace-kmvcb\" (UID: \"4c56bc8a-e994-4465-9234-38aabe8411fa\") " pod="openshift-marketplace/redhat-marketplace-kmvcb" Oct 09 11:35:47 crc kubenswrapper[4740]: I1009 11:35:47.902788 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kmvcb" Oct 09 11:35:48 crc kubenswrapper[4740]: I1009 11:35:48.350919 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kmvcb"] Oct 09 11:35:49 crc kubenswrapper[4740]: I1009 11:35:49.332209 4740 generic.go:334] "Generic (PLEG): container finished" podID="4c56bc8a-e994-4465-9234-38aabe8411fa" containerID="fae1b16a61ac302c56c16e3f843553628bb8962e495b78f77da48f3e3f3d502f" exitCode=0 Oct 09 11:35:49 crc kubenswrapper[4740]: I1009 11:35:49.332509 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmvcb" event={"ID":"4c56bc8a-e994-4465-9234-38aabe8411fa","Type":"ContainerDied","Data":"fae1b16a61ac302c56c16e3f843553628bb8962e495b78f77da48f3e3f3d502f"} Oct 09 11:35:49 crc kubenswrapper[4740]: I1009 11:35:49.332534 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmvcb" event={"ID":"4c56bc8a-e994-4465-9234-38aabe8411fa","Type":"ContainerStarted","Data":"5b236b41751b96b75b9d800640a59ba42f6b39d8cd0f582d1312b6d5d5e96806"} Oct 09 11:35:49 crc kubenswrapper[4740]: I1009 11:35:49.336090 4740 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Oct 09 11:35:50 crc kubenswrapper[4740]: I1009 11:35:50.344999 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmvcb" event={"ID":"4c56bc8a-e994-4465-9234-38aabe8411fa","Type":"ContainerStarted","Data":"67fe00bd2ce889f8a80d1139762a80282e64fbfdd7d8bd854ed4850bc4126f9e"} Oct 09 11:35:50 crc kubenswrapper[4740]: E1009 11:35:50.670027 4740 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.176:39422->38.102.83.176:39657: write tcp 38.102.83.176:39422->38.102.83.176:39657: write: broken pipe Oct 09 11:35:51 crc kubenswrapper[4740]: I1009 11:35:51.357320 4740 generic.go:334] "Generic (PLEG): container finished" podID="4c56bc8a-e994-4465-9234-38aabe8411fa" containerID="67fe00bd2ce889f8a80d1139762a80282e64fbfdd7d8bd854ed4850bc4126f9e" exitCode=0 Oct 09 11:35:51 crc kubenswrapper[4740]: I1009 11:35:51.357425 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmvcb" event={"ID":"4c56bc8a-e994-4465-9234-38aabe8411fa","Type":"ContainerDied","Data":"67fe00bd2ce889f8a80d1139762a80282e64fbfdd7d8bd854ed4850bc4126f9e"} Oct 09 11:35:52 crc kubenswrapper[4740]: I1009 11:35:52.370877 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmvcb" event={"ID":"4c56bc8a-e994-4465-9234-38aabe8411fa","Type":"ContainerStarted","Data":"192567ca3766475d9a5eb008415f767de7bebebedf1753e92d2c2fe2e0591c5e"} Oct 09 11:35:52 crc kubenswrapper[4740]: I1009 11:35:52.398075 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kmvcb" podStartSLOduration=2.973180065 podStartE2EDuration="5.398047769s" podCreationTimestamp="2025-10-09 11:35:47 +0000 UTC" firstStartedPulling="2025-10-09 11:35:49.335740144 +0000 UTC m=+4088.297940525" lastFinishedPulling="2025-10-09 11:35:51.760607848 +0000 UTC m=+4090.722808229" 
observedRunningTime="2025-10-09 11:35:52.394606653 +0000 UTC m=+4091.356807044" watchObservedRunningTime="2025-10-09 11:35:52.398047769 +0000 UTC m=+4091.360248190" Oct 09 11:35:56 crc kubenswrapper[4740]: I1009 11:35:56.521692 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6swms/must-gather-bqb9l"] Oct 09 11:35:56 crc kubenswrapper[4740]: I1009 11:35:56.524156 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-6swms/must-gather-bqb9l" podUID="ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2" containerName="copy" containerID="cri-o://5ba1edcfde37d08b6e1ed8d9d62104aac7483caf448ee72308fb0c9b8c61b2a4" gracePeriod=2 Oct 09 11:35:56 crc kubenswrapper[4740]: I1009 11:35:56.540687 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6swms/must-gather-bqb9l"] Oct 09 11:35:57 crc kubenswrapper[4740]: I1009 11:35:57.116388 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6swms_must-gather-bqb9l_ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2/copy/0.log" Oct 09 11:35:57 crc kubenswrapper[4740]: I1009 11:35:57.116972 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6swms/must-gather-bqb9l" Oct 09 11:35:57 crc kubenswrapper[4740]: I1009 11:35:57.297930 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2-must-gather-output\") pod \"ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2\" (UID: \"ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2\") " Oct 09 11:35:57 crc kubenswrapper[4740]: I1009 11:35:57.298179 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5ntr\" (UniqueName: \"kubernetes.io/projected/ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2-kube-api-access-m5ntr\") pod \"ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2\" (UID: \"ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2\") " Oct 09 11:35:57 crc kubenswrapper[4740]: I1009 11:35:57.319981 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2-kube-api-access-m5ntr" (OuterVolumeSpecName: "kube-api-access-m5ntr") pod "ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2" (UID: "ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2"). InnerVolumeSpecName "kube-api-access-m5ntr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:35:57 crc kubenswrapper[4740]: I1009 11:35:57.399779 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5ntr\" (UniqueName: \"kubernetes.io/projected/ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2-kube-api-access-m5ntr\") on node \"crc\" DevicePath \"\"" Oct 09 11:35:57 crc kubenswrapper[4740]: I1009 11:35:57.418095 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6swms_must-gather-bqb9l_ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2/copy/0.log" Oct 09 11:35:57 crc kubenswrapper[4740]: I1009 11:35:57.418427 4740 generic.go:334] "Generic (PLEG): container finished" podID="ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2" containerID="5ba1edcfde37d08b6e1ed8d9d62104aac7483caf448ee72308fb0c9b8c61b2a4" exitCode=143 Oct 09 11:35:57 crc kubenswrapper[4740]: I1009 11:35:57.418483 4740 scope.go:117] "RemoveContainer" containerID="5ba1edcfde37d08b6e1ed8d9d62104aac7483caf448ee72308fb0c9b8c61b2a4" Oct 09 11:35:57 crc kubenswrapper[4740]: I1009 11:35:57.418499 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6swms/must-gather-bqb9l" Oct 09 11:35:57 crc kubenswrapper[4740]: I1009 11:35:57.436904 4740 scope.go:117] "RemoveContainer" containerID="0568a33f5e7da95fecc806738bf271a04b228f33cea9525883cae55298d188b3" Oct 09 11:35:57 crc kubenswrapper[4740]: I1009 11:35:57.447568 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2" (UID: "ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:35:57 crc kubenswrapper[4740]: I1009 11:35:57.501641 4740 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 09 11:35:57 crc kubenswrapper[4740]: I1009 11:35:57.506838 4740 scope.go:117] "RemoveContainer" containerID="5ba1edcfde37d08b6e1ed8d9d62104aac7483caf448ee72308fb0c9b8c61b2a4" Oct 09 11:35:57 crc kubenswrapper[4740]: E1009 11:35:57.507355 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ba1edcfde37d08b6e1ed8d9d62104aac7483caf448ee72308fb0c9b8c61b2a4\": container with ID starting with 5ba1edcfde37d08b6e1ed8d9d62104aac7483caf448ee72308fb0c9b8c61b2a4 not found: ID does not exist" containerID="5ba1edcfde37d08b6e1ed8d9d62104aac7483caf448ee72308fb0c9b8c61b2a4" Oct 09 11:35:57 crc kubenswrapper[4740]: I1009 11:35:57.507401 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ba1edcfde37d08b6e1ed8d9d62104aac7483caf448ee72308fb0c9b8c61b2a4"} err="failed to get container status \"5ba1edcfde37d08b6e1ed8d9d62104aac7483caf448ee72308fb0c9b8c61b2a4\": rpc error: code = NotFound desc = could not find container \"5ba1edcfde37d08b6e1ed8d9d62104aac7483caf448ee72308fb0c9b8c61b2a4\": container with ID starting with 5ba1edcfde37d08b6e1ed8d9d62104aac7483caf448ee72308fb0c9b8c61b2a4 not found: ID does not exist" Oct 09 11:35:57 crc kubenswrapper[4740]: I1009 11:35:57.507429 4740 scope.go:117] "RemoveContainer" containerID="0568a33f5e7da95fecc806738bf271a04b228f33cea9525883cae55298d188b3" Oct 09 11:35:57 crc kubenswrapper[4740]: E1009 11:35:57.508022 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0568a33f5e7da95fecc806738bf271a04b228f33cea9525883cae55298d188b3\": container with ID starting with 0568a33f5e7da95fecc806738bf271a04b228f33cea9525883cae55298d188b3 not found: ID does not exist" containerID="0568a33f5e7da95fecc806738bf271a04b228f33cea9525883cae55298d188b3" Oct 09 11:35:57 crc kubenswrapper[4740]: I1009 11:35:57.508066 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0568a33f5e7da95fecc806738bf271a04b228f33cea9525883cae55298d188b3"} err="failed to get container status \"0568a33f5e7da95fecc806738bf271a04b228f33cea9525883cae55298d188b3\": rpc error: code = NotFound desc = could not find container \"0568a33f5e7da95fecc806738bf271a04b228f33cea9525883cae55298d188b3\": container with ID starting with 0568a33f5e7da95fecc806738bf271a04b228f33cea9525883cae55298d188b3 not found: ID does not exist" Oct 09 11:35:57 crc kubenswrapper[4740]: I1009 11:35:57.770160 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2" path="/var/lib/kubelet/pods/ae08dc49-aeff-4606-b1cb-5ddc6d2dc0e2/volumes" Oct 09 11:35:57 crc kubenswrapper[4740]: I1009 11:35:57.903838 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kmvcb" Oct 09 11:35:57 crc kubenswrapper[4740]: I1009 11:35:57.904123 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kmvcb" Oct 09 11:35:58 crc kubenswrapper[4740]: I1009 11:35:58.502183 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kmvcb" Oct 09 11:35:59 crc kubenswrapper[4740]: I1009 11:35:59.481640 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kmvcb" Oct 09 11:35:59 crc kubenswrapper[4740]: I1009 11:35:59.523886 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-kmvcb"] Oct 09 11:36:01 crc kubenswrapper[4740]: I1009 11:36:01.452591 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kmvcb" podUID="4c56bc8a-e994-4465-9234-38aabe8411fa" containerName="registry-server" containerID="cri-o://192567ca3766475d9a5eb008415f767de7bebebedf1753e92d2c2fe2e0591c5e" gracePeriod=2 Oct 09 11:36:02 crc kubenswrapper[4740]: I1009 11:36:02.468374 4740 generic.go:334] "Generic (PLEG): container finished" podID="4c56bc8a-e994-4465-9234-38aabe8411fa" containerID="192567ca3766475d9a5eb008415f767de7bebebedf1753e92d2c2fe2e0591c5e" exitCode=0 Oct 09 11:36:02 crc kubenswrapper[4740]: I1009 11:36:02.469034 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmvcb" event={"ID":"4c56bc8a-e994-4465-9234-38aabe8411fa","Type":"ContainerDied","Data":"192567ca3766475d9a5eb008415f767de7bebebedf1753e92d2c2fe2e0591c5e"} Oct 09 11:36:02 crc kubenswrapper[4740]: I1009 11:36:02.676689 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kmvcb" Oct 09 11:36:02 crc kubenswrapper[4740]: I1009 11:36:02.816099 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c56bc8a-e994-4465-9234-38aabe8411fa-utilities\") pod \"4c56bc8a-e994-4465-9234-38aabe8411fa\" (UID: \"4c56bc8a-e994-4465-9234-38aabe8411fa\") " Oct 09 11:36:02 crc kubenswrapper[4740]: I1009 11:36:02.816289 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxfrp\" (UniqueName: \"kubernetes.io/projected/4c56bc8a-e994-4465-9234-38aabe8411fa-kube-api-access-qxfrp\") pod \"4c56bc8a-e994-4465-9234-38aabe8411fa\" (UID: \"4c56bc8a-e994-4465-9234-38aabe8411fa\") " Oct 09 11:36:02 crc kubenswrapper[4740]: I1009 11:36:02.816409 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c56bc8a-e994-4465-9234-38aabe8411fa-catalog-content\") pod \"4c56bc8a-e994-4465-9234-38aabe8411fa\" (UID: \"4c56bc8a-e994-4465-9234-38aabe8411fa\") " Oct 09 11:36:02 crc kubenswrapper[4740]: I1009 11:36:02.816808 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c56bc8a-e994-4465-9234-38aabe8411fa-utilities" (OuterVolumeSpecName: "utilities") pod "4c56bc8a-e994-4465-9234-38aabe8411fa" (UID: "4c56bc8a-e994-4465-9234-38aabe8411fa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:36:02 crc kubenswrapper[4740]: I1009 11:36:02.816939 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c56bc8a-e994-4465-9234-38aabe8411fa-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 11:36:02 crc kubenswrapper[4740]: I1009 11:36:02.821964 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c56bc8a-e994-4465-9234-38aabe8411fa-kube-api-access-qxfrp" (OuterVolumeSpecName: "kube-api-access-qxfrp") pod "4c56bc8a-e994-4465-9234-38aabe8411fa" (UID: "4c56bc8a-e994-4465-9234-38aabe8411fa"). InnerVolumeSpecName "kube-api-access-qxfrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 11:36:02 crc kubenswrapper[4740]: I1009 11:36:02.828744 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c56bc8a-e994-4465-9234-38aabe8411fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c56bc8a-e994-4465-9234-38aabe8411fa" (UID: "4c56bc8a-e994-4465-9234-38aabe8411fa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 11:36:02 crc kubenswrapper[4740]: I1009 11:36:02.919056 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxfrp\" (UniqueName: \"kubernetes.io/projected/4c56bc8a-e994-4465-9234-38aabe8411fa-kube-api-access-qxfrp\") on node \"crc\" DevicePath \"\"" Oct 09 11:36:02 crc kubenswrapper[4740]: I1009 11:36:02.919105 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c56bc8a-e994-4465-9234-38aabe8411fa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 11:36:03 crc kubenswrapper[4740]: I1009 11:36:03.482962 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmvcb" event={"ID":"4c56bc8a-e994-4465-9234-38aabe8411fa","Type":"ContainerDied","Data":"5b236b41751b96b75b9d800640a59ba42f6b39d8cd0f582d1312b6d5d5e96806"} Oct 09 11:36:03 crc kubenswrapper[4740]: I1009 11:36:03.483027 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kmvcb" Oct 09 11:36:03 crc kubenswrapper[4740]: I1009 11:36:03.483060 4740 scope.go:117] "RemoveContainer" containerID="192567ca3766475d9a5eb008415f767de7bebebedf1753e92d2c2fe2e0591c5e" Oct 09 11:36:03 crc kubenswrapper[4740]: I1009 11:36:03.517066 4740 scope.go:117] "RemoveContainer" containerID="67fe00bd2ce889f8a80d1139762a80282e64fbfdd7d8bd854ed4850bc4126f9e" Oct 09 11:36:03 crc kubenswrapper[4740]: I1009 11:36:03.524983 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kmvcb"] Oct 09 11:36:03 crc kubenswrapper[4740]: I1009 11:36:03.537788 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kmvcb"] Oct 09 11:36:03 crc kubenswrapper[4740]: I1009 11:36:03.539258 4740 scope.go:117] "RemoveContainer" containerID="fae1b16a61ac302c56c16e3f843553628bb8962e495b78f77da48f3e3f3d502f" Oct 09 11:36:03 crc kubenswrapper[4740]: I1009 11:36:03.769107 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c56bc8a-e994-4465-9234-38aabe8411fa" path="/var/lib/kubelet/pods/4c56bc8a-e994-4465-9234-38aabe8411fa/volumes" Oct 09 11:36:05 crc kubenswrapper[4740]: I1009 11:36:05.407631 4740 patch_prober.go:28] interesting pod/machine-config-daemon-kdjch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 11:36:05 crc kubenswrapper[4740]: I1009 11:36:05.409915 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 11:36:05 crc kubenswrapper[4740]: 
I1009 11:36:05.409992 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" Oct 09 11:36:05 crc kubenswrapper[4740]: I1009 11:36:05.412543 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ae51d440167983a5287ca7bb6e99bb81cf70c62b4ac5f857d15a24ed78f2d8d3"} pod="openshift-machine-config-operator/machine-config-daemon-kdjch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 11:36:05 crc kubenswrapper[4740]: I1009 11:36:05.412653 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" podUID="223b849a-db98-4f56-a649-9e144189950a" containerName="machine-config-daemon" containerID="cri-o://ae51d440167983a5287ca7bb6e99bb81cf70c62b4ac5f857d15a24ed78f2d8d3" gracePeriod=600 Oct 09 11:36:06 crc kubenswrapper[4740]: I1009 11:36:06.514187 4740 generic.go:334] "Generic (PLEG): container finished" podID="223b849a-db98-4f56-a649-9e144189950a" containerID="ae51d440167983a5287ca7bb6e99bb81cf70c62b4ac5f857d15a24ed78f2d8d3" exitCode=0 Oct 09 11:36:06 crc kubenswrapper[4740]: I1009 11:36:06.514252 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerDied","Data":"ae51d440167983a5287ca7bb6e99bb81cf70c62b4ac5f857d15a24ed78f2d8d3"} Oct 09 11:36:06 crc kubenswrapper[4740]: I1009 11:36:06.515161 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kdjch" event={"ID":"223b849a-db98-4f56-a649-9e144189950a","Type":"ContainerStarted","Data":"7e01b19ed6f0f2d4baa19900b98b9da9a9b848a2e26053c0ee6486c720ffe0f5"} Oct 09 11:36:06 crc kubenswrapper[4740]: I1009 11:36:06.515195 4740 
scope.go:117] "RemoveContainer" containerID="222fd05088090023f2cf038bb3ee61d41f51624ca52e7d8d72668bd838882930"